House Prices Regression Analysis

In [1]:
import os
import pandas as pd
import numpy as np
import warnings
from sklearn.preprocessing import OrdinalEncoder, MinMaxScaler
from sklearn.impute import KNNImputer, SimpleImputer
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error
import plotly.graph_objects as go
from plotly.subplots import make_subplots
import plotly.express as px
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import seaborn as sns
from imblearn.over_sampling import RandomOverSampler
from imblearn.under_sampling import RandomUnderSampler
import tensorflow as tf
import sys
import smogn

Part 1: Loading and Reading the Data

Data is loaded from the CSV files and read into DataFrames

In [30]:
seed = 1
tf.random.set_seed(seed)
warnings.filterwarnings("ignore", category=DeprecationWarning)
np.set_printoptions(threshold=sys.maxsize)
pd.set_option('display.max_rows', 1000)

wd = os.getcwd()
train_file = wd + '//train.csv'
valid_file = wd + '//test.csv'

train_df = pd.read_csv(train_file)
valid_df = pd.read_csv(valid_file)

Part 2: Data Exploratory Analysis

Part 2.1: Getting basic information, including NAs

Firstly, the counts, mean, standard deviation, min, quartile and max for numerical columns. From the counts of the training data it is observed that there is a maximum of 3700 rows where some rows have fewer, most likely from NAs which is to be confirmed below. Similarly, the validation dataset has a maximum of 200 rows with a few variables having less than that.

In [ ]:
train_df.describe()
In [ ]:
valid_df.describe()

Next, the number of blank rows (NAs) is tabularized by variable .

In [3]:
na_list = [] 
col_list = train_df.columns
for col in col_list:
    na_list.append(train_df[col].isna().sum())
    
na_df = pd.DataFrame({'Name':col_list,'Count':na_list})
na_df['Percentage'] = na_df['Count']/len(valid_df)
na_df['Percentage'] = na_df['Percentage'].apply(lambda x:'%.1f' %x)
na_df.sort_values('Percentage',ascending=False)
Out[3]:
Name Count Percentage
72 PoolQC 1452 1.0
74 MiscFeature 1404 1.0
6 Alley 1367 0.9
73 Fence 1177 0.8
57 FireplaceQu 690 0.5
... ... ... ...
27 ExterQual 0 0.0
26 MasVnrArea 8 0.0
25 MasVnrType 8 0.0
24 Exterior2nd 0 0.0
80 SalePrice 0 0.0

81 rows × 3 columns

It is observed that variable v16 has the highest number of NAs by a large margin and could be dropped from training if it is observed to contribute significantly to the inaccuracy of the model.

Part 2.2: Data distribution for all variables (numerical & categorical) using a Violin plot

In [4]:
num_col = list(train_df.describe().keys())
cat_col = [x for x in col_list if x not in num_col]
cat_col_bkp = cat_col

row_titles = ['Train','Valid']
no_row = len(row_titles)
no_num_col = len(col_list)
fig_subplot = make_subplots(rows=no_row,cols=no_num_col,row_titles=row_titles)
df_list = [train_df,valid_df]
#Train vs Validation rows
for i,j in enumerate(df_list):
    #Iterating through the variables
    for k,l in enumerate(j.columns):
        fig_subplot.append_trace(go.Violin(y=j[l].values,name=str(l),box_visible=False,meanline_visible=True,fillcolor='lightyellow',line_color='grey'),row=i+1,col=k+1)
fig_subplot.layout.update(showlegend=False,title_text = 'Figure 2.2: Violin Plot Data Distribution of Variables',width=12800,height=720)
fig_subplot.show()
In [5]:
## conduct smogn
train_smogn = smogn.smoter(
    
    ## main arguments
    data = train_df,           ## pandas dataframe
    y = 'SalePrice',          ## string ('header name')
    k = 9,                    ## positive integer (k < n)
    samp_method = 'extreme',  ## string ('balance' or 'extreme')

    ## phi relevance arguments
    rel_thres = 0.80,         ## positive real number (0 < R < 1)
    rel_method = 'auto',      ## string ('auto' or 'manual')
    rel_xtrm_type = 'high',   ## string ('low' or 'both' or 'high')
    rel_coef = 2.25           ## positive real number (0 < R)
)
train_smogn = train_smogn.reset_index(drop=True)

#Creating a combined train-validation dataset for imputation, scaling etc.
train_group_list = [0] * len(train_smogn) #Group 0 = train
valid_group_list = [1] * len(valid_df) #Group 1 = validation
train_group_df = pd.DataFrame(train_group_list,columns=['Group'])
valid_group_df = pd.DataFrame(valid_group_list,columns=['Group'])
#train_group_df = train_group_df.reset_index(drop=True)

train_df = pd.concat([train_smogn,train_group_df],axis=1)
valid_df = pd.concat([valid_df,valid_group_df],axis=1)

#Create a pipeline to turn categorical values to numerical
comb_df = pd.concat([train_df,valid_df],axis=0)
comb_df = comb_df.reset_index(drop=True) #Reseting the index after combining

#Adding the 'Group' column back into num_col    
if('Group' not in num_col):
    num_col.append('Group')
comb_cat_df = comb_df[cat_col]
comb_num_df = comb_df[num_col]
dist_matrix: 100%|#####################################################################| 70/70 [00:03<00:00, 18.02it/s]
synth_matrix: 100%|####################################################################| 70/70 [00:14<00:00,  4.89it/s]
r_index: 100%|#########################################################################| 56/56 [00:00<00:00, 77.79it/s]

Figure 2.2 are violin plots illustrating the data distribution by variable. The shape of the violin plots makes the categorical and numerical variables differetiable visually. Categorical variables have the datapoints split to the ends of the violin plot, where as numerical variables are more normally distributed. Based on the distribution, v17 has been falsely assumed to be a numerical variable. It is actually a binary classified categorical variable but does not need to be normalized.

There are certain variables that have different data distributions for the training and validation datasets, for example v8, v12 and v17. This might affect the accuracy of prediction for the validation set.

It is also observed that the training dataset has classLabel imbalance with much more 'yes' than 'no' examples, where as the validation dataset has a more equal distribution of 'yes' and 'no' examples which might also affect the accuracy of the model. This could potentially be offset by undersampling of the examples with 'yes' classLabels or oversampling of the examples with 'no' classLabels during training.

Next, the two parts of the dataset are combined for further visualization.

Part 2.3: Data distribution heatmap for categorical values

In [6]:
cat_col = ['BldgType','HouseStyle','ExterQual','Foundation','BsmtCond','CentralAir','KitchenQual','GarageType','GarageQual','GarageCond','PoolQC','Fence','SaleType']
list_len = len(cat_col)

combo_list = []
fig_cm = make_subplots(rows=list_len,cols=list_len,row_titles=cat_col,column_titles=cat_col)
#Iterating through the variables
for i,j in enumerate(cat_col):
    #Iterating through the variables
    for k,l in enumerate(cat_col):
        values = comb_cat_df.groupby(j).aggregate(l).value_counts(normalize=True).unstack().values
        values = np.round(values,2)
        x_keys = list(comb_cat_df[j].unique())
        if(np.nan in x_keys):
            x_keys.remove(np.nan)  
        y_keys = list(comb_cat_df[l].unique())
        if(np.nan in y_keys):
            y_keys.remove(np.nan)
        fig_cm.append_trace(go.Heatmap(
            x = x_keys,
            y = y_keys,
            z = values,
            type = 'heatmap',
            colorscale = 'Oryel',
            text = values,
            texttemplate = '%{text}',
            hoverongaps = False,
            colorbar = dict(tick0=0.0,dtick=0.1),
            zmin=0.0,
            zmax=1.0
        ),row=i+1,col=k+1)
fig_cm.update_xaxes(tickson='boundaries')
fig_cm.update_yaxes(tickson='boundaries')
fig_cm.layout.update(width=4320,height=1620,title_text='Figure 2.3: Data Distribution Heatmap for Categorical Variables')
fig_cm.show()

Figure 2.3 illustrates the data distribution heatmap for categorical variables. The purpose of plotting the data distribution heatmap for categorical variables is to visually discover data distribution skews in variables with respect to other variables and the classLabel.

For example in the second row of the first column, variable v4 is more likely to be skewed in favor of 'y' for when v1 is 'b'. On the other hand, it is equally distributed for when v1 is 'a', while 'l' is rarely seen. Note that the diagonal consist of exchange matrices (all 0s with 1s along the matrix diagonal).

It is observed that all variables are skewed with respect to classLabel but are balanced with respect to each other with the exception of v12, which is skewed even when compared to every other variable.

Part 2.4: Correlation heatmap for numerical variables

In [7]:
#Get the correlation matrix
corr_matrix = comb_df.corr()
corr_col_list = corr_matrix.columns
for col in corr_col_list:
    corr_matrix[col] = corr_matrix[col].apply(lambda x:'%.2f' %x)
#Create a heatmap based on the matrix
fig_corr = px.imshow(corr_matrix,text_auto=True,color_continuous_scale='oryel')
fig_corr.layout.update(width=2160,height=1620,title_text='Figure 2.4: Correlation Heatmap for Numerical Variables',xaxis_title='Variables',yaxis_title='Variables')
fig_corr.show()

Figure 2.4 illustrates the correlation heatmap for numerical variables. Variables v13 and v15 are almost fully correlated so one of them can be dropped if either is observed to contribute significantly to the inaccuracy of the model. As expected, the diagonal consist of 1s because each variable is fully correlated with itself.

Part 3: Preprocessing - Data imputation, Normalization, Encoding

The categorical variables are imputed with the mode and converted to numerical values using the OrdinalEncoder, while the numerical values are imputed with the k-Nearest Neighbors Imputer and normalized to a value between 0 and 1.

In [8]:
mms = MinMaxScaler()

#Categorical pipeline for preprocessing categorical variables
cat_pipeline = make_pipeline(
    SimpleImputer(strategy='most_frequent'),
    OrdinalEncoder()
)

#Numerical pipeline for preprocessing numerical variables
num_pipeline = make_pipeline(
    KNNImputer(n_neighbors=5),
    mms,
)

#Dropping variable v17 from the numerical dataframe, as it does not need to be preprocessed
#v17_df = comb_num_df['v17']
#num_col.remove('v17')
#comb_num_df = comb_num_df.drop(columns=['v17'])

cat_col = cat_col_bkp
comb_cat_arr = cat_pipeline.fit_transform(comb_cat_df)
comb_cat_df = pd.DataFrame(comb_cat_arr,columns=cat_col)

comb_num_df_bkp = comb_num_df
comb_num_arr = num_pipeline.fit_transform(comb_num_df)
comb_num_df = pd.DataFrame(comb_num_arr,columns=num_col)
no_of_dummy_col = comb_num_df.shape[1] - 1

#Adding variable v17 to the categorical dataframe
#comb_num_df = pd.concat([comb_num_df,v17_df],axis=1)

#Concatenating the processed numerical and categorical dataframes
proc_comb_df = pd.concat([comb_num_df,comb_cat_df,],axis=1)

#The data is saved in the dataframe to be able to perform multiple runs on the data, including dropping variables 
temp_df = proc_comb_df

Part 4: Model Training and Results

Part 4.1: Initial training with all variables present

A Keras sequential model with two layers is used. The first layer consist of a Dense layer with the ReLU (rectified linear unit) activation function. The output layer uses a sigmoid activation function since it is a binary classification task. Binary crossentropy loss is used for training and accuracy is the metric to track.

In [9]:
def uncompiled_seq_model():
    model = tf.keras.models.Sequential([
        tf.keras.layers.Dense(40,activation='relu',kernel_initializer=tf.keras.initializers.random_normal(seed=seed)),
        tf.keras.layers.Dense(10,activation='relu',kernel_initializer=tf.keras.initializers.random_normal(seed=seed)),
        tf.keras.layers.Dense(1,activation='relu',kernel_initializer=tf.keras.initializers.random_normal(seed=seed))
    ])
    return model

def seq_model(lr=0.001):
    model = uncompiled_seq_model()
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                    loss='mae',
                    metrics=['mae'])
    
    return model
In [10]:
def model_predict(model,valid_data):
    
    (x_valid,y_valid) = valid_data
    yhat_valid = model.predict(x_valid)
    results = yhat_valid
    
    return results

Next, the combined dataframe is split back to train and valid datasets, respectively. The 'Group' label can be dropped as it does not contain information useful for training the model.

The train data is used for model training and the results checked against the validation dataset.

In [16]:
#Resplit the variable back into training and validation (Group)
proc_valid_df = proc_comb_df[proc_comb_df['Group']==1]
proc_train_df = proc_comb_df[proc_comb_df['Group']==0]

proc_train_y = proc_train_df['SalePrice']
proc_train_x = proc_train_df.drop(columns=['SalePrice'])
proc_train_x = proc_train_df.drop(columns=['Group'])

#proc_valid_y = proc_valid_df['classLabel']
#proc_valid_x = proc_valid_df.drop(columns=['classLabel'])
proc_valid_x = proc_valid_df.drop(columns=['Group'])

#x_valid = proc_valid_x
#y_valid = proc_valid_y
x_train = proc_train_x
y_train = proc_train_y

x_train,x_test,y_train,y_test = train_test_split(x_train,y_train,test_size=0.2)

#Backup for more training iterations

x_test_bkp = x_test
y_test_bkp = y_test
x_train_bkp = x_train
y_train_bkp = y_train

#Measuring the model against validation data
model = seq_model()

checkpoint_filepath = wd + '//checkpoint//model.ckpt'
model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
    filepath=checkpoint_filepath,
    save_weights_only=True,
    monitor='val_mae',
    mode='min',
    save_best_only=True)

model.fit(x_train,y_train,epochs=200,validation_data=(x_test,y_test),verbose=2,callbacks=[model_checkpoint_callback])
test_data = (x_test,y_test)
model.load_weights(checkpoint_filepath)
yhat_test = model_predict(model,test_data)

dummy_arr = np.zeros((np.shape(y_test)[0],no_of_dummy_col))
y_test = np.expand_dims(y_test,axis=1)
y_test = np.concatenate((dummy_arr,y_test),axis=1)
yhat_test = np.concatenate((dummy_arr,yhat_test),axis=1)

#That good line
scaler = MinMaxScaler().fit(comb_num_df_bkp['SalePrice'].to_numpy().reshape(-1,1))

y_test = scaler.inverse_transform(y_test)
yhat_test = scaler.inverse_transform(yhat_test)
print(mean_absolute_error(yhat_test[:,-1],y_test[:,-1]))

#print("Validation accuracy:%.2f" %valid_accuracy)
Epoch 1/200
66/66 - 1s - loss: 0.1324 - mae: 0.1324 - val_loss: 0.0794 - val_mae: 0.0794 - 550ms/epoch - 8ms/step
Epoch 2/200
66/66 - 0s - loss: 0.0653 - mae: 0.0653 - val_loss: 0.0667 - val_mae: 0.0667 - 169ms/epoch - 3ms/step
Epoch 3/200
66/66 - 0s - loss: 0.0543 - mae: 0.0543 - val_loss: 0.0540 - val_mae: 0.0540 - 183ms/epoch - 3ms/step
Epoch 4/200
66/66 - 0s - loss: 0.0473 - mae: 0.0473 - val_loss: 0.0483 - val_mae: 0.0483 - 164ms/epoch - 2ms/step
Epoch 5/200
66/66 - 0s - loss: 0.0435 - mae: 0.0435 - val_loss: 0.0454 - val_mae: 0.0454 - 196ms/epoch - 3ms/step
Epoch 6/200
66/66 - 0s - loss: 0.0416 - mae: 0.0416 - val_loss: 0.0480 - val_mae: 0.0480 - 153ms/epoch - 2ms/step
Epoch 7/200
66/66 - 0s - loss: 0.0407 - mae: 0.0407 - val_loss: 0.0397 - val_mae: 0.0397 - 184ms/epoch - 3ms/step
Epoch 8/200
66/66 - 0s - loss: 0.0351 - mae: 0.0351 - val_loss: 0.0380 - val_mae: 0.0380 - 201ms/epoch - 3ms/step
Epoch 9/200
66/66 - 0s - loss: 0.0335 - mae: 0.0335 - val_loss: 0.0348 - val_mae: 0.0348 - 144ms/epoch - 2ms/step
Epoch 10/200
66/66 - 0s - loss: 0.0316 - mae: 0.0316 - val_loss: 0.0318 - val_mae: 0.0318 - 132ms/epoch - 2ms/step
Epoch 11/200
66/66 - 0s - loss: 0.0309 - mae: 0.0309 - val_loss: 0.0341 - val_mae: 0.0341 - 99ms/epoch - 1ms/step
Epoch 12/200
66/66 - 0s - loss: 0.0305 - mae: 0.0305 - val_loss: 0.0299 - val_mae: 0.0299 - 156ms/epoch - 2ms/step
Epoch 13/200
66/66 - 0s - loss: 0.0279 - mae: 0.0279 - val_loss: 0.0329 - val_mae: 0.0329 - 149ms/epoch - 2ms/step
Epoch 14/200
66/66 - 0s - loss: 0.0276 - mae: 0.0276 - val_loss: 0.0308 - val_mae: 0.0308 - 132ms/epoch - 2ms/step
Epoch 15/200
66/66 - 0s - loss: 0.0248 - mae: 0.0248 - val_loss: 0.0269 - val_mae: 0.0269 - 171ms/epoch - 3ms/step
Epoch 16/200
66/66 - 0s - loss: 0.0247 - mae: 0.0247 - val_loss: 0.0258 - val_mae: 0.0258 - 159ms/epoch - 2ms/step
Epoch 17/200
66/66 - 0s - loss: 0.0273 - mae: 0.0273 - val_loss: 0.0343 - val_mae: 0.0343 - 113ms/epoch - 2ms/step
Epoch 18/200
66/66 - 0s - loss: 0.0257 - mae: 0.0257 - val_loss: 0.0249 - val_mae: 0.0249 - 182ms/epoch - 3ms/step
Epoch 19/200
66/66 - 0s - loss: 0.0246 - mae: 0.0246 - val_loss: 0.0303 - val_mae: 0.0303 - 143ms/epoch - 2ms/step
Epoch 20/200
66/66 - 0s - loss: 0.0233 - mae: 0.0233 - val_loss: 0.0246 - val_mae: 0.0246 - 220ms/epoch - 3ms/step
Epoch 21/200
66/66 - 0s - loss: 0.0231 - mae: 0.0231 - val_loss: 0.0291 - val_mae: 0.0291 - 138ms/epoch - 2ms/step
Epoch 22/200
66/66 - 0s - loss: 0.0229 - mae: 0.0229 - val_loss: 0.0262 - val_mae: 0.0262 - 137ms/epoch - 2ms/step
Epoch 23/200
66/66 - 0s - loss: 0.0208 - mae: 0.0208 - val_loss: 0.0220 - val_mae: 0.0220 - 210ms/epoch - 3ms/step
Epoch 24/200
66/66 - 0s - loss: 0.0215 - mae: 0.0215 - val_loss: 0.0274 - val_mae: 0.0274 - 152ms/epoch - 2ms/step
Epoch 25/200
66/66 - 0s - loss: 0.0209 - mae: 0.0209 - val_loss: 0.0265 - val_mae: 0.0265 - 164ms/epoch - 2ms/step
Epoch 26/200
66/66 - 0s - loss: 0.0226 - mae: 0.0226 - val_loss: 0.0226 - val_mae: 0.0226 - 140ms/epoch - 2ms/step
Epoch 27/200
66/66 - 0s - loss: 0.0219 - mae: 0.0219 - val_loss: 0.0228 - val_mae: 0.0228 - 155ms/epoch - 2ms/step
Epoch 28/200
66/66 - 0s - loss: 0.0197 - mae: 0.0197 - val_loss: 0.0206 - val_mae: 0.0206 - 202ms/epoch - 3ms/step
Epoch 29/200
66/66 - 0s - loss: 0.0189 - mae: 0.0189 - val_loss: 0.0233 - val_mae: 0.0233 - 156ms/epoch - 2ms/step
Epoch 30/200
66/66 - 0s - loss: 0.0202 - mae: 0.0202 - val_loss: 0.0270 - val_mae: 0.0270 - 155ms/epoch - 2ms/step
Epoch 31/200
66/66 - 0s - loss: 0.0208 - mae: 0.0208 - val_loss: 0.0206 - val_mae: 0.0206 - 218ms/epoch - 3ms/step
Epoch 32/200
66/66 - 0s - loss: 0.0176 - mae: 0.0176 - val_loss: 0.0233 - val_mae: 0.0233 - 133ms/epoch - 2ms/step
Epoch 33/200
66/66 - 0s - loss: 0.0225 - mae: 0.0225 - val_loss: 0.0228 - val_mae: 0.0228 - 124ms/epoch - 2ms/step
Epoch 34/200
66/66 - 0s - loss: 0.0195 - mae: 0.0195 - val_loss: 0.0196 - val_mae: 0.0196 - 205ms/epoch - 3ms/step
Epoch 35/200
66/66 - 0s - loss: 0.0178 - mae: 0.0178 - val_loss: 0.0216 - val_mae: 0.0216 - 129ms/epoch - 2ms/step
Epoch 36/200
66/66 - 0s - loss: 0.0176 - mae: 0.0176 - val_loss: 0.0207 - val_mae: 0.0207 - 142ms/epoch - 2ms/step
Epoch 37/200
66/66 - 0s - loss: 0.0173 - mae: 0.0173 - val_loss: 0.0174 - val_mae: 0.0174 - 190ms/epoch - 3ms/step
Epoch 38/200
66/66 - 0s - loss: 0.0192 - mae: 0.0192 - val_loss: 0.0194 - val_mae: 0.0194 - 122ms/epoch - 2ms/step
Epoch 39/200
66/66 - 0s - loss: 0.0173 - mae: 0.0173 - val_loss: 0.0176 - val_mae: 0.0176 - 110ms/epoch - 2ms/step
Epoch 40/200
66/66 - 0s - loss: 0.0168 - mae: 0.0168 - val_loss: 0.0189 - val_mae: 0.0189 - 138ms/epoch - 2ms/step
Epoch 41/200
66/66 - 0s - loss: 0.0188 - mae: 0.0188 - val_loss: 0.0306 - val_mae: 0.0306 - 112ms/epoch - 2ms/step
Epoch 42/200
66/66 - 0s - loss: 0.0184 - mae: 0.0184 - val_loss: 0.0163 - val_mae: 0.0163 - 177ms/epoch - 3ms/step
Epoch 43/200
66/66 - 0s - loss: 0.0159 - mae: 0.0159 - val_loss: 0.0196 - val_mae: 0.0196 - 131ms/epoch - 2ms/step
Epoch 44/200
66/66 - 0s - loss: 0.0198 - mae: 0.0198 - val_loss: 0.0180 - val_mae: 0.0180 - 157ms/epoch - 2ms/step
Epoch 45/200
66/66 - 0s - loss: 0.0160 - mae: 0.0160 - val_loss: 0.0199 - val_mae: 0.0199 - 157ms/epoch - 2ms/step
Epoch 46/200
66/66 - 0s - loss: 0.0155 - mae: 0.0155 - val_loss: 0.0156 - val_mae: 0.0156 - 181ms/epoch - 3ms/step
Epoch 47/200
66/66 - 0s - loss: 0.0157 - mae: 0.0157 - val_loss: 0.0149 - val_mae: 0.0149 - 196ms/epoch - 3ms/step
Epoch 48/200
66/66 - 0s - loss: 0.0151 - mae: 0.0151 - val_loss: 0.0163 - val_mae: 0.0163 - 159ms/epoch - 2ms/step
Epoch 49/200
66/66 - 0s - loss: 0.0164 - mae: 0.0164 - val_loss: 0.0153 - val_mae: 0.0153 - 137ms/epoch - 2ms/step
Epoch 50/200
66/66 - 0s - loss: 0.0185 - mae: 0.0185 - val_loss: 0.0158 - val_mae: 0.0158 - 127ms/epoch - 2ms/step
Epoch 51/200
66/66 - 0s - loss: 0.0166 - mae: 0.0166 - val_loss: 0.0142 - val_mae: 0.0142 - 168ms/epoch - 3ms/step
Epoch 52/200
66/66 - 0s - loss: 0.0145 - mae: 0.0145 - val_loss: 0.0143 - val_mae: 0.0143 - 145ms/epoch - 2ms/step
Epoch 53/200
66/66 - 0s - loss: 0.0142 - mae: 0.0142 - val_loss: 0.0161 - val_mae: 0.0161 - 144ms/epoch - 2ms/step
Epoch 54/200
66/66 - 0s - loss: 0.0159 - mae: 0.0159 - val_loss: 0.0225 - val_mae: 0.0225 - 125ms/epoch - 2ms/step
Epoch 55/200
66/66 - 0s - loss: 0.0153 - mae: 0.0153 - val_loss: 0.0172 - val_mae: 0.0172 - 158ms/epoch - 2ms/step
Epoch 56/200
66/66 - 0s - loss: 0.0138 - mae: 0.0138 - val_loss: 0.0187 - val_mae: 0.0187 - 144ms/epoch - 2ms/step
Epoch 57/200
66/66 - 0s - loss: 0.0135 - mae: 0.0135 - val_loss: 0.0143 - val_mae: 0.0143 - 97ms/epoch - 1ms/step
Epoch 58/200
66/66 - 0s - loss: 0.0146 - mae: 0.0146 - val_loss: 0.0133 - val_mae: 0.0133 - 203ms/epoch - 3ms/step
Epoch 59/200
66/66 - 0s - loss: 0.0132 - mae: 0.0132 - val_loss: 0.0128 - val_mae: 0.0128 - 158ms/epoch - 2ms/step
Epoch 60/200
66/66 - 0s - loss: 0.0160 - mae: 0.0160 - val_loss: 0.0199 - val_mae: 0.0199 - 146ms/epoch - 2ms/step
Epoch 61/200
66/66 - 0s - loss: 0.0126 - mae: 0.0126 - val_loss: 0.0154 - val_mae: 0.0154 - 154ms/epoch - 2ms/step
Epoch 62/200
66/66 - 0s - loss: 0.0137 - mae: 0.0137 - val_loss: 0.0125 - val_mae: 0.0125 - 163ms/epoch - 2ms/step
Epoch 63/200
66/66 - 0s - loss: 0.0145 - mae: 0.0145 - val_loss: 0.0122 - val_mae: 0.0122 - 154ms/epoch - 2ms/step
Epoch 64/200
66/66 - 0s - loss: 0.0129 - mae: 0.0129 - val_loss: 0.0115 - val_mae: 0.0115 - 155ms/epoch - 2ms/step
Epoch 65/200
66/66 - 0s - loss: 0.0120 - mae: 0.0120 - val_loss: 0.0124 - val_mae: 0.0124 - 108ms/epoch - 2ms/step
Epoch 66/200
66/66 - 0s - loss: 0.0133 - mae: 0.0133 - val_loss: 0.0131 - val_mae: 0.0131 - 132ms/epoch - 2ms/step
Epoch 67/200
66/66 - 0s - loss: 0.0124 - mae: 0.0124 - val_loss: 0.0131 - val_mae: 0.0131 - 110ms/epoch - 2ms/step
Epoch 68/200
66/66 - 0s - loss: 0.0122 - mae: 0.0122 - val_loss: 0.0113 - val_mae: 0.0113 - 132ms/epoch - 2ms/step
Epoch 69/200
66/66 - 0s - loss: 0.0121 - mae: 0.0121 - val_loss: 0.0135 - val_mae: 0.0135 - 159ms/epoch - 2ms/step
Epoch 70/200
66/66 - 0s - loss: 0.0131 - mae: 0.0131 - val_loss: 0.0154 - val_mae: 0.0154 - 134ms/epoch - 2ms/step
Epoch 71/200
66/66 - 0s - loss: 0.0143 - mae: 0.0143 - val_loss: 0.0108 - val_mae: 0.0108 - 136ms/epoch - 2ms/step
Epoch 72/200
66/66 - 0s - loss: 0.0127 - mae: 0.0127 - val_loss: 0.0112 - val_mae: 0.0112 - 166ms/epoch - 3ms/step
Epoch 73/200
66/66 - 0s - loss: 0.0134 - mae: 0.0134 - val_loss: 0.0103 - val_mae: 0.0103 - 209ms/epoch - 3ms/step
Epoch 74/200
66/66 - 0s - loss: 0.0139 - mae: 0.0139 - val_loss: 0.0197 - val_mae: 0.0197 - 148ms/epoch - 2ms/step
Epoch 75/200
66/66 - 0s - loss: 0.0141 - mae: 0.0141 - val_loss: 0.0115 - val_mae: 0.0115 - 96ms/epoch - 1ms/step
Epoch 76/200
66/66 - 0s - loss: 0.0131 - mae: 0.0131 - val_loss: 0.0124 - val_mae: 0.0124 - 100ms/epoch - 2ms/step
Epoch 77/200
66/66 - 0s - loss: 0.0126 - mae: 0.0126 - val_loss: 0.0183 - val_mae: 0.0183 - 76ms/epoch - 1ms/step
Epoch 78/200
66/66 - 0s - loss: 0.0100 - mae: 0.0100 - val_loss: 0.0128 - val_mae: 0.0128 - 127ms/epoch - 2ms/step
Epoch 79/200
66/66 - 0s - loss: 0.0109 - mae: 0.0109 - val_loss: 0.0147 - val_mae: 0.0147 - 121ms/epoch - 2ms/step
Epoch 80/200
66/66 - 0s - loss: 0.0108 - mae: 0.0108 - val_loss: 0.0106 - val_mae: 0.0106 - 155ms/epoch - 2ms/step
Epoch 81/200
66/66 - 0s - loss: 0.0102 - mae: 0.0102 - val_loss: 0.0128 - val_mae: 0.0128 - 148ms/epoch - 2ms/step
Epoch 82/200
66/66 - 0s - loss: 0.0112 - mae: 0.0112 - val_loss: 0.0099 - val_mae: 0.0099 - 218ms/epoch - 3ms/step
Epoch 83/200
66/66 - 0s - loss: 0.0114 - mae: 0.0114 - val_loss: 0.0113 - val_mae: 0.0113 - 117ms/epoch - 2ms/step
Epoch 84/200
66/66 - 0s - loss: 0.0144 - mae: 0.0144 - val_loss: 0.0147 - val_mae: 0.0147 - 97ms/epoch - 1ms/step
Epoch 85/200
66/66 - 0s - loss: 0.0108 - mae: 0.0108 - val_loss: 0.0095 - val_mae: 0.0095 - 200ms/epoch - 3ms/step
Epoch 86/200
66/66 - 0s - loss: 0.0137 - mae: 0.0137 - val_loss: 0.0262 - val_mae: 0.0262 - 135ms/epoch - 2ms/step
Epoch 87/200
66/66 - 0s - loss: 0.0126 - mae: 0.0126 - val_loss: 0.0093 - val_mae: 0.0093 - 180ms/epoch - 3ms/step
Epoch 88/200
66/66 - 0s - loss: 0.0100 - mae: 0.0100 - val_loss: 0.0113 - val_mae: 0.0113 - 151ms/epoch - 2ms/step
Epoch 89/200
66/66 - 0s - loss: 0.0100 - mae: 0.0100 - val_loss: 0.0092 - val_mae: 0.0092 - 166ms/epoch - 3ms/step
Epoch 90/200
66/66 - 0s - loss: 0.0102 - mae: 0.0102 - val_loss: 0.0089 - val_mae: 0.0089 - 159ms/epoch - 2ms/step
Epoch 91/200
66/66 - 0s - loss: 0.0099 - mae: 0.0099 - val_loss: 0.0088 - val_mae: 0.0088 - 174ms/epoch - 3ms/step
Epoch 92/200
66/66 - 0s - loss: 0.0093 - mae: 0.0093 - val_loss: 0.0092 - val_mae: 0.0092 - 161ms/epoch - 2ms/step
Epoch 93/200
66/66 - 0s - loss: 0.0120 - mae: 0.0120 - val_loss: 0.0094 - val_mae: 0.0094 - 128ms/epoch - 2ms/step
Epoch 94/200
66/66 - 0s - loss: 0.0115 - mae: 0.0115 - val_loss: 0.0118 - val_mae: 0.0118 - 96ms/epoch - 1ms/step
Epoch 95/200
66/66 - 0s - loss: 0.0094 - mae: 0.0094 - val_loss: 0.0164 - val_mae: 0.0164 - 115ms/epoch - 2ms/step
Epoch 96/200
66/66 - 0s - loss: 0.0101 - mae: 0.0101 - val_loss: 0.0113 - val_mae: 0.0113 - 149ms/epoch - 2ms/step
Epoch 97/200
66/66 - 0s - loss: 0.0105 - mae: 0.0105 - val_loss: 0.0148 - val_mae: 0.0148 - 137ms/epoch - 2ms/step
Epoch 98/200
66/66 - 0s - loss: 0.0117 - mae: 0.0117 - val_loss: 0.0080 - val_mae: 0.0080 - 206ms/epoch - 3ms/step
Epoch 99/200
66/66 - 0s - loss: 0.0134 - mae: 0.0134 - val_loss: 0.0084 - val_mae: 0.0084 - 162ms/epoch - 2ms/step
Epoch 100/200
66/66 - 0s - loss: 0.0105 - mae: 0.0105 - val_loss: 0.0182 - val_mae: 0.0182 - 129ms/epoch - 2ms/step
Epoch 101/200
66/66 - 0s - loss: 0.0115 - mae: 0.0115 - val_loss: 0.0181 - val_mae: 0.0181 - 145ms/epoch - 2ms/step
Epoch 102/200
66/66 - 0s - loss: 0.0116 - mae: 0.0116 - val_loss: 0.0188 - val_mae: 0.0188 - 154ms/epoch - 2ms/step
Epoch 103/200
66/66 - 0s - loss: 0.0103 - mae: 0.0103 - val_loss: 0.0111 - val_mae: 0.0111 - 144ms/epoch - 2ms/step
Epoch 104/200
66/66 - 0s - loss: 0.0120 - mae: 0.0120 - val_loss: 0.0089 - val_mae: 0.0089 - 109ms/epoch - 2ms/step
Epoch 105/200
66/66 - 0s - loss: 0.0092 - mae: 0.0092 - val_loss: 0.0075 - val_mae: 0.0075 - 189ms/epoch - 3ms/step
Epoch 106/200
66/66 - 0s - loss: 0.0083 - mae: 0.0083 - val_loss: 0.0074 - val_mae: 0.0074 - 186ms/epoch - 3ms/step
Epoch 107/200
66/66 - 0s - loss: 0.0087 - mae: 0.0087 - val_loss: 0.0142 - val_mae: 0.0142 - 143ms/epoch - 2ms/step
Epoch 108/200
66/66 - 0s - loss: 0.0091 - mae: 0.0091 - val_loss: 0.0069 - val_mae: 0.0069 - 189ms/epoch - 3ms/step
Epoch 109/200
66/66 - 0s - loss: 0.0079 - mae: 0.0079 - val_loss: 0.0105 - val_mae: 0.0105 - 121ms/epoch - 2ms/step
Epoch 110/200
66/66 - 0s - loss: 0.0101 - mae: 0.0101 - val_loss: 0.0090 - val_mae: 0.0090 - 94ms/epoch - 1ms/step
Epoch 111/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0069 - val_mae: 0.0069 - 119ms/epoch - 2ms/step
Epoch 112/200
66/66 - 0s - loss: 0.0087 - mae: 0.0087 - val_loss: 0.0140 - val_mae: 0.0140 - 148ms/epoch - 2ms/step
Epoch 113/200
66/66 - 0s - loss: 0.0096 - mae: 0.0096 - val_loss: 0.0068 - val_mae: 0.0068 - 172ms/epoch - 3ms/step
Epoch 114/200
66/66 - 0s - loss: 0.0089 - mae: 0.0089 - val_loss: 0.0103 - val_mae: 0.0103 - 157ms/epoch - 2ms/step
Epoch 115/200
66/66 - 0s - loss: 0.0096 - mae: 0.0096 - val_loss: 0.0107 - val_mae: 0.0107 - 155ms/epoch - 2ms/step
Epoch 116/200
66/66 - 0s - loss: 0.0090 - mae: 0.0090 - val_loss: 0.0113 - val_mae: 0.0113 - 139ms/epoch - 2ms/step
Epoch 117/200
66/66 - 0s - loss: 0.0079 - mae: 0.0079 - val_loss: 0.0079 - val_mae: 0.0079 - 87ms/epoch - 1ms/step
Epoch 118/200
66/66 - 0s - loss: 0.0112 - mae: 0.0112 - val_loss: 0.0072 - val_mae: 0.0072 - 90ms/epoch - 1ms/step
Epoch 119/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0092 - val_mae: 0.0092 - 150ms/epoch - 2ms/step
Epoch 120/200
66/66 - 0s - loss: 0.0095 - mae: 0.0095 - val_loss: 0.0092 - val_mae: 0.0092 - 156ms/epoch - 2ms/step
Epoch 121/200
66/66 - 0s - loss: 0.0074 - mae: 0.0074 - val_loss: 0.0064 - val_mae: 0.0064 - 187ms/epoch - 3ms/step
Epoch 122/200
66/66 - 0s - loss: 0.0091 - mae: 0.0091 - val_loss: 0.0081 - val_mae: 0.0081 - 149ms/epoch - 2ms/step
Epoch 123/200
66/66 - 0s - loss: 0.0147 - mae: 0.0147 - val_loss: 0.0092 - val_mae: 0.0092 - 162ms/epoch - 2ms/step
Epoch 124/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0072 - val_mae: 0.0072 - 158ms/epoch - 2ms/step
Epoch 125/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0084 - val_mae: 0.0084 - 148ms/epoch - 2ms/step
Epoch 126/200
66/66 - 0s - loss: 0.0141 - mae: 0.0141 - val_loss: 0.0140 - val_mae: 0.0140 - 109ms/epoch - 2ms/step
Epoch 127/200
66/66 - 0s - loss: 0.0117 - mae: 0.0117 - val_loss: 0.0114 - val_mae: 0.0114 - 121ms/epoch - 2ms/step
Epoch 128/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0094 - val_mae: 0.0094 - 132ms/epoch - 2ms/step
Epoch 129/200
66/66 - 0s - loss: 0.0078 - mae: 0.0078 - val_loss: 0.0184 - val_mae: 0.0184 - 141ms/epoch - 2ms/step
Epoch 130/200
66/66 - 0s - loss: 0.0086 - mae: 0.0086 - val_loss: 0.0066 - val_mae: 0.0066 - 128ms/epoch - 2ms/step
Epoch 131/200
66/66 - 0s - loss: 0.0110 - mae: 0.0110 - val_loss: 0.0093 - val_mae: 0.0093 - 93ms/epoch - 1ms/step
Epoch 132/200
66/66 - 0s - loss: 0.0141 - mae: 0.0141 - val_loss: 0.0094 - val_mae: 0.0094 - 124ms/epoch - 2ms/step
Epoch 133/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0078 - val_mae: 0.0078 - 161ms/epoch - 2ms/step
Epoch 134/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0109 - val_mae: 0.0109 - 140ms/epoch - 2ms/step
Epoch 135/200
66/66 - 0s - loss: 0.0108 - mae: 0.0108 - val_loss: 0.0066 - val_mae: 0.0066 - 99ms/epoch - 1ms/step
Epoch 136/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0061 - val_mae: 0.0061 - 149ms/epoch - 2ms/step
Epoch 137/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0077 - val_mae: 0.0077 - 167ms/epoch - 3ms/step
Epoch 138/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0087 - val_mae: 0.0087 - 159ms/epoch - 2ms/step
Epoch 139/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0121 - val_mae: 0.0121 - 111ms/epoch - 2ms/step
Epoch 140/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0056 - val_mae: 0.0056 - 164ms/epoch - 2ms/step
Epoch 141/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0076 - val_mae: 0.0076 - 140ms/epoch - 2ms/step
Epoch 142/200
66/66 - 0s - loss: 0.0088 - mae: 0.0088 - val_loss: 0.0061 - val_mae: 0.0061 - 138ms/epoch - 2ms/step
Epoch 143/200
66/66 - 0s - loss: 0.0124 - mae: 0.0124 - val_loss: 0.0075 - val_mae: 0.0075 - 118ms/epoch - 2ms/step
Epoch 144/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0077 - val_mae: 0.0077 - 114ms/epoch - 2ms/step
Epoch 145/200
66/66 - 0s - loss: 0.0074 - mae: 0.0074 - val_loss: 0.0107 - val_mae: 0.0107 - 153ms/epoch - 2ms/step
Epoch 146/200
66/66 - 0s - loss: 0.0097 - mae: 0.0097 - val_loss: 0.0137 - val_mae: 0.0137 - 94ms/epoch - 1ms/step
Epoch 147/200
66/66 - 0s - loss: 0.0089 - mae: 0.0089 - val_loss: 0.0076 - val_mae: 0.0076 - 149ms/epoch - 2ms/step
Epoch 148/200
66/66 - 0s - loss: 0.0079 - mae: 0.0079 - val_loss: 0.0056 - val_mae: 0.0056 - 119ms/epoch - 2ms/step
Epoch 149/200
66/66 - 0s - loss: 0.0062 - mae: 0.0062 - val_loss: 0.0084 - val_mae: 0.0084 - 135ms/epoch - 2ms/step
Epoch 150/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0081 - val_mae: 0.0081 - 137ms/epoch - 2ms/step
Epoch 151/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0070 - val_mae: 0.0070 - 147ms/epoch - 2ms/step
Epoch 152/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0055 - val_mae: 0.0055 - 147ms/epoch - 2ms/step
Epoch 153/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0141 - val_mae: 0.0141 - 112ms/epoch - 2ms/step
Epoch 154/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0082 - val_mae: 0.0082 - 162ms/epoch - 2ms/step
Epoch 155/200
66/66 - 0s - loss: 0.0084 - mae: 0.0084 - val_loss: 0.0106 - val_mae: 0.0106 - 155ms/epoch - 2ms/step
Epoch 156/200
66/66 - 0s - loss: 0.0071 - mae: 0.0071 - val_loss: 0.0084 - val_mae: 0.0084 - 154ms/epoch - 2ms/step
Epoch 157/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0081 - val_mae: 0.0081 - 151ms/epoch - 2ms/step
Epoch 158/200
66/66 - 0s - loss: 0.0124 - mae: 0.0124 - val_loss: 0.0209 - val_mae: 0.0209 - 138ms/epoch - 2ms/step
Epoch 159/200
66/66 - 0s - loss: 0.0078 - mae: 0.0078 - val_loss: 0.0054 - val_mae: 0.0054 - 146ms/epoch - 2ms/step
Epoch 160/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0052 - val_mae: 0.0052 - 162ms/epoch - 2ms/step
Epoch 161/200
66/66 - 0s - loss: 0.0102 - mae: 0.0102 - val_loss: 0.0052 - val_mae: 0.0052 - 158ms/epoch - 2ms/step
Epoch 162/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0082 - val_mae: 0.0082 - 115ms/epoch - 2ms/step
Epoch 163/200
66/66 - 0s - loss: 0.0060 - mae: 0.0060 - val_loss: 0.0046 - val_mae: 0.0046 - 168ms/epoch - 3ms/step
Epoch 164/200
66/66 - 0s - loss: 0.0083 - mae: 0.0083 - val_loss: 0.0055 - val_mae: 0.0055 - 162ms/epoch - 2ms/step
Epoch 165/200
66/66 - 0s - loss: 0.0100 - mae: 0.0100 - val_loss: 0.0183 - val_mae: 0.0183 - 125ms/epoch - 2ms/step
Epoch 166/200
66/66 - 0s - loss: 0.0129 - mae: 0.0129 - val_loss: 0.0064 - val_mae: 0.0064 - 123ms/epoch - 2ms/step
Epoch 167/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0100 - val_mae: 0.0100 - 134ms/epoch - 2ms/step
Epoch 168/200
66/66 - 0s - loss: 0.0084 - mae: 0.0084 - val_loss: 0.0063 - val_mae: 0.0063 - 124ms/epoch - 2ms/step
Epoch 169/200
66/66 - 0s - loss: 0.0107 - mae: 0.0107 - val_loss: 0.0077 - val_mae: 0.0077 - 97ms/epoch - 1ms/step
Epoch 170/200
66/66 - 0s - loss: 0.0073 - mae: 0.0073 - val_loss: 0.0049 - val_mae: 0.0049 - 100ms/epoch - 2ms/step
Epoch 171/200
66/66 - 0s - loss: 0.0073 - mae: 0.0073 - val_loss: 0.0061 - val_mae: 0.0061 - 109ms/epoch - 2ms/step
Epoch 172/200
66/66 - 0s - loss: 0.0070 - mae: 0.0070 - val_loss: 0.0087 - val_mae: 0.0087 - 115ms/epoch - 2ms/step
Epoch 173/200
66/66 - 0s - loss: 0.0064 - mae: 0.0064 - val_loss: 0.0137 - val_mae: 0.0137 - 85ms/epoch - 1ms/step
Epoch 174/200
66/66 - 0s - loss: 0.0070 - mae: 0.0070 - val_loss: 0.0089 - val_mae: 0.0089 - 91ms/epoch - 1ms/step
Epoch 175/200
66/66 - 0s - loss: 0.0067 - mae: 0.0067 - val_loss: 0.0054 - val_mae: 0.0054 - 99ms/epoch - 2ms/step
Epoch 176/200
66/66 - 0s - loss: 0.0096 - mae: 0.0096 - val_loss: 0.0111 - val_mae: 0.0111 - 143ms/epoch - 2ms/step
Epoch 177/200
66/66 - 0s - loss: 0.0060 - mae: 0.0060 - val_loss: 0.0047 - val_mae: 0.0047 - 104ms/epoch - 2ms/step
Epoch 178/200
66/66 - 0s - loss: 0.0067 - mae: 0.0067 - val_loss: 0.0051 - val_mae: 0.0051 - 115ms/epoch - 2ms/step
Epoch 179/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0050 - val_mae: 0.0050 - 143ms/epoch - 2ms/step
Epoch 180/200
66/66 - 0s - loss: 0.0080 - mae: 0.0080 - val_loss: 0.0222 - val_mae: 0.0222 - 113ms/epoch - 2ms/step
Epoch 181/200
66/66 - 0s - loss: 0.0099 - mae: 0.0099 - val_loss: 0.0050 - val_mae: 0.0050 - 118ms/epoch - 2ms/step
Epoch 182/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0056 - val_mae: 0.0056 - 88ms/epoch - 1ms/step
Epoch 183/200
66/66 - 0s - loss: 0.0066 - mae: 0.0066 - val_loss: 0.0055 - val_mae: 0.0055 - 144ms/epoch - 2ms/step
Epoch 184/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0088 - val_mae: 0.0088 - 97ms/epoch - 1ms/step
Epoch 185/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0080 - val_mae: 0.0080 - 92ms/epoch - 1ms/step
Epoch 186/200
66/66 - 0s - loss: 0.0063 - mae: 0.0063 - val_loss: 0.0059 - val_mae: 0.0059 - 105ms/epoch - 2ms/step
Epoch 187/200
66/66 - 0s - loss: 0.0071 - mae: 0.0071 - val_loss: 0.0049 - val_mae: 0.0049 - 150ms/epoch - 2ms/step
Epoch 188/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0051 - val_mae: 0.0051 - 105ms/epoch - 2ms/step
Epoch 189/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0092 - val_mae: 0.0092 - 100ms/epoch - 2ms/step
Epoch 190/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0094 - val_mae: 0.0094 - 118ms/epoch - 2ms/step
Epoch 191/200
66/66 - 0s - loss: 0.0071 - mae: 0.0071 - val_loss: 0.0045 - val_mae: 0.0045 - 207ms/epoch - 3ms/step
Epoch 192/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0074 - val_mae: 0.0074 - 150ms/epoch - 2ms/step
Epoch 193/200
66/66 - 0s - loss: 0.0053 - mae: 0.0053 - val_loss: 0.0045 - val_mae: 0.0045 - 207ms/epoch - 3ms/step
Epoch 194/200
66/66 - 0s - loss: 0.0051 - mae: 0.0051 - val_loss: 0.0049 - val_mae: 0.0049 - 143ms/epoch - 2ms/step
Epoch 195/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0057 - val_mae: 0.0057 - 133ms/epoch - 2ms/step
Epoch 196/200
66/66 - 0s - loss: 0.0097 - mae: 0.0097 - val_loss: 0.0084 - val_mae: 0.0084 - 97ms/epoch - 1ms/step
Epoch 197/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0049 - val_mae: 0.0049 - 124ms/epoch - 2ms/step
Epoch 198/200
66/66 - 0s - loss: 0.0060 - mae: 0.0060 - val_loss: 0.0082 - val_mae: 0.0082 - 118ms/epoch - 2ms/step
Epoch 199/200
66/66 - 0s - loss: 0.0111 - mae: 0.0111 - val_loss: 0.0128 - val_mae: 0.0128 - 114ms/epoch - 2ms/step
Epoch 200/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0065 - val_mae: 0.0065 - 97ms/epoch - 1ms/step
17/17 [==============================] - 0s 870us/step
3217.048216442718

Part 4.2: Iteratively dropping variables to check the effects on the model accuracy

In [40]:
#column_list = ['v1','v2','v3','v4','v5','v6','v7','v8','v9','v10','v11','v12','v13','v14','v15','v16','v17',]
column_list = num_col + cat_col
column_list.remove('SalePrice')
column_list.remove('Group')

var_list = []
accuracy_list = []

for col in column_list:
    model = seq_model()
    
    x_train = x_train_bkp
    y_train = y_train_bkp
    x_test = x_test_bkp
    y_test = y_test_bkp
    
    x_train = x_train.drop(columns=[col])
    x_test = x_test.drop(columns=[col])
    
    print("\nEvaluation of model without variable %s" %col)
    model.fit(x_train,y_train,epochs=10,validation_data=(x_test,y_test),verbose=0)
    test_data = (x_test,y_test)
    yhat_test = model_predict(model,test_data)
    test_mae = float(mean_absolute_error(yhat_test,y_test))
    var_list.append(col)
    accuracy_list.append(test_mae)
    print("MAE: %.4f" %test_mae)

var_df = pd.DataFrame(var_list,columns=['var'])
accuracy_df = pd.DataFrame(accuracy_list,columns=['accuracy'])
accuracy_df = pd.concat([var_df,accuracy_df],axis=1)
accuracy_df = accuracy_df.sort_values('accuracy',ascending=False)
Evaluation of model without variable Id
17/17 [==============================] - 0s 1ms/step
MAE: 0.0320

Evaluation of model without variable MSSubClass
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable LotFrontage
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable LotArea
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable OverallQual
17/17 [==============================] - 0s 1ms/step
MAE: 0.0340

Evaluation of model without variable OverallCond
17/17 [==============================] - 0s 1ms/step
MAE: 0.0345

Evaluation of model without variable YearBuilt
17/17 [==============================] - 0s 1ms/step
MAE: 0.0333

Evaluation of model without variable YearRemodAdd
17/17 [==============================] - 0s 1ms/step
MAE: 0.0329

Evaluation of model without variable MasVnrArea
17/17 [==============================] - 0s 1ms/step
MAE: 0.0380

Evaluation of model without variable BsmtFinSF1
17/17 [==============================] - 0s 1ms/step
MAE: 0.0395

Evaluation of model without variable BsmtFinSF2
17/17 [==============================] - 0s 1ms/step
MAE: 0.0328

Evaluation of model without variable BsmtUnfSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.0326

Evaluation of model without variable TotalBsmtSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.0359

Evaluation of model without variable 1stFlrSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.0317

Evaluation of model without variable 2ndFlrSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable LowQualFinSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.0315

Evaluation of model without variable GrLivArea
17/17 [==============================] - 0s 1ms/step
MAE: 0.0337

Evaluation of model without variable BsmtFullBath
17/17 [==============================] - 0s 1ms/step
MAE: 0.0376

Evaluation of model without variable BsmtHalfBath
17/17 [==============================] - 0s 1ms/step
MAE: 0.0328

Evaluation of model without variable FullBath
17/17 [==============================] - 0s 1ms/step
MAE: 0.0319

Evaluation of model without variable HalfBath
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable BedroomAbvGr
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable KitchenAbvGr
17/17 [==============================] - 0s 1ms/step
MAE: 0.0328

Evaluation of model without variable TotRmsAbvGrd
17/17 [==============================] - 0s 1ms/step
MAE: 0.0350

Evaluation of model without variable Fireplaces
17/17 [==============================] - 0s 1ms/step
MAE: 0.0355

Evaluation of model without variable GarageYrBlt
17/17 [==============================] - 0s 1ms/step
MAE: 0.0316

Evaluation of model without variable GarageCars
17/17 [==============================] - 0s 1ms/step
MAE: 0.0335

Evaluation of model without variable GarageArea
17/17 [==============================] - 0s 1ms/step
MAE: 0.0346

Evaluation of model without variable WoodDeckSF
17/17 [==============================] - 0s 1ms/step
MAE: 0.0350

Evaluation of model without variable OpenPorchSF
17/17 [==============================] - 0s 938us/step
MAE: 0.0327

Evaluation of model without variable EnclosedPorch
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable 3SsnPorch
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable ScreenPorch
17/17 [==============================] - 0s 1ms/step
MAE: 0.0359

Evaluation of model without variable PoolArea
17/17 [==============================] - 0s 1ms/step
MAE: 0.0310

Evaluation of model without variable MiscVal
17/17 [==============================] - 0s 1ms/step
MAE: 0.0341

Evaluation of model without variable MoSold
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable YrSold
17/17 [==============================] - 0s 1ms/step
MAE: 0.0346

Evaluation of model without variable MSZoning
17/17 [==============================] - 0s 1ms/step
MAE: 0.0301

Evaluation of model without variable Street
17/17 [==============================] - 0s 1ms/step
MAE: 0.0345

Evaluation of model without variable Alley
17/17 [==============================] - 0s 1ms/step
MAE: 0.0311

Evaluation of model without variable LotShape
17/17 [==============================] - 0s 1ms/step
MAE: 0.0344

Evaluation of model without variable LandContour
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable Utilities
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable LotConfig
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable LandSlope
17/17 [==============================] - 0s 1ms/step
MAE: 0.0356

Evaluation of model without variable Neighborhood
17/17 [==============================] - 0s 875us/step
MAE: 0.0321

Evaluation of model without variable Condition1
17/17 [==============================] - 0s 1ms/step
MAE: 0.0343

Evaluation of model without variable Condition2
17/17 [==============================] - 0s 1ms/step
MAE: 0.0331

Evaluation of model without variable BldgType
17/17 [==============================] - 0s 1ms/step
MAE: 0.0353

Evaluation of model without variable HouseStyle
17/17 [==============================] - 0s 937us/step
MAE: 0.0331

Evaluation of model without variable RoofStyle
17/17 [==============================] - 0s 978us/step
MAE: 0.3786

Evaluation of model without variable RoofMatl
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable Exterior1st
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable Exterior2nd
17/17 [==============================] - 0s 1ms/step
MAE: 0.0326

Evaluation of model without variable MasVnrType
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable ExterQual
17/17 [==============================] - 0s 1ms/step
MAE: 0.0294

Evaluation of model without variable ExterCond
17/17 [==============================] - 0s 1ms/step
MAE: 0.0350

Evaluation of model without variable Foundation
17/17 [==============================] - 0s 1ms/step
MAE: 0.0341

Evaluation of model without variable BsmtQual
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable BsmtCond
17/17 [==============================] - 0s 1ms/step
MAE: 0.0346

Evaluation of model without variable BsmtExposure
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable BsmtFinType1
17/17 [==============================] - 0s 882us/step
MAE: 0.0329

Evaluation of model without variable BsmtFinType2
17/17 [==============================] - 0s 1ms/step
MAE: 0.0334

Evaluation of model without variable Heating
17/17 [==============================] - 0s 1ms/step
MAE: 0.0328

Evaluation of model without variable HeatingQC
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable CentralAir
17/17 [==============================] - 0s 1ms/step
MAE: 0.0338

Evaluation of model without variable Electrical
17/17 [==============================] - 0s 1ms/step
MAE: 0.0321

Evaluation of model without variable KitchenQual
17/17 [==============================] - 0s 1ms/step
MAE: 0.0304

Evaluation of model without variable Functional
17/17 [==============================] - 0s 1ms/step
MAE: 0.0329

Evaluation of model without variable FireplaceQu
17/17 [==============================] - 0s 1ms/step
MAE: 0.0356

Evaluation of model without variable GarageType
17/17 [==============================] - 0s 1ms/step
MAE: 0.0374

Evaluation of model without variable GarageFinish
17/17 [==============================] - 0s 1ms/step
MAE: 0.0310

Evaluation of model without variable GarageQual
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786

Evaluation of model without variable GarageCond
17/17 [==============================] - 0s 1ms/step
MAE: 0.0317

Evaluation of model without variable PavedDrive
17/17 [==============================] - 0s 1ms/step
MAE: 0.0316

Evaluation of model without variable PoolQC
17/17 [==============================] - 0s 1ms/step
MAE: 0.0330

Evaluation of model without variable Fence
17/17 [==============================] - 0s 1ms/step
MAE: 0.0366

Evaluation of model without variable MiscFeature
17/17 [==============================] - 0s 1ms/step
MAE: 0.0305

Evaluation of model without variable SaleType
17/17 [==============================] - 0s 1ms/step
MAE: 0.0319

Evaluation of model without variable SaleCondition
17/17 [==============================] - 0s 1ms/step
MAE: 0.3786
In [45]:
#print(accuracy_df)
#accuracy_df['accuracy'] = accuracy_df['accuracy'].apply(lambda x:float(x))
var_to_exclude = list(accuracy_df[accuracy_df['accuracy']<0.033]['var'])

Part 4.3: Learning rate optimization

In [46]:
x_train = x_train_bkp
y_train = y_train_bkp
x_test = x_test_bkp
y_test = y_test_bkp
lr_epochs = 120

model = seq_model()
lr_schedule = tf.keras.callbacks.LearningRateScheduler(lambda epoch: 1e-6 * 10**(epoch / 20))
history = model.fit(x_train,y_train,epochs=lr_epochs,validation_data=(x_test,y_test),verbose=2,callbacks=[lr_schedule])
Epoch 1/120
66/66 - 1s - loss: 0.3175 - mae: 0.3175 - val_loss: 0.3344 - val_mae: 0.3344 - lr: 1.0000e-06 - 601ms/epoch - 9ms/step
Epoch 2/120
66/66 - 0s - loss: 0.3159 - mae: 0.3159 - val_loss: 0.3327 - val_mae: 0.3327 - lr: 1.1220e-06 - 103ms/epoch - 2ms/step
Epoch 3/120
66/66 - 0s - loss: 0.3141 - mae: 0.3141 - val_loss: 0.3308 - val_mae: 0.3308 - lr: 1.2589e-06 - 100ms/epoch - 2ms/step
Epoch 4/120
66/66 - 0s - loss: 0.3120 - mae: 0.3120 - val_loss: 0.3287 - val_mae: 0.3287 - lr: 1.4125e-06 - 101ms/epoch - 2ms/step
Epoch 5/120
66/66 - 0s - loss: 0.3098 - mae: 0.3098 - val_loss: 0.3263 - val_mae: 0.3263 - lr: 1.5849e-06 - 106ms/epoch - 2ms/step
Epoch 6/120
66/66 - 0s - loss: 0.3072 - mae: 0.3072 - val_loss: 0.3236 - val_mae: 0.3236 - lr: 1.7783e-06 - 136ms/epoch - 2ms/step
Epoch 7/120
66/66 - 0s - loss: 0.3044 - mae: 0.3044 - val_loss: 0.3206 - val_mae: 0.3206 - lr: 1.9953e-06 - 160ms/epoch - 2ms/step
Epoch 8/120
66/66 - 0s - loss: 0.3011 - mae: 0.3011 - val_loss: 0.3171 - val_mae: 0.3171 - lr: 2.2387e-06 - 173ms/epoch - 3ms/step
Epoch 9/120
66/66 - 0s - loss: 0.2975 - mae: 0.2975 - val_loss: 0.3133 - val_mae: 0.3133 - lr: 2.5119e-06 - 120ms/epoch - 2ms/step
Epoch 10/120
66/66 - 0s - loss: 0.2933 - mae: 0.2933 - val_loss: 0.3089 - val_mae: 0.3089 - lr: 2.8184e-06 - 155ms/epoch - 2ms/step
Epoch 11/120
66/66 - 0s - loss: 0.2887 - mae: 0.2887 - val_loss: 0.3041 - val_mae: 0.3041 - lr: 3.1623e-06 - 121ms/epoch - 2ms/step
Epoch 12/120
66/66 - 0s - loss: 0.2835 - mae: 0.2835 - val_loss: 0.2986 - val_mae: 0.2986 - lr: 3.5481e-06 - 116ms/epoch - 2ms/step
Epoch 13/120
66/66 - 0s - loss: 0.2778 - mae: 0.2778 - val_loss: 0.2926 - val_mae: 0.2926 - lr: 3.9811e-06 - 161ms/epoch - 2ms/step
Epoch 14/120
66/66 - 0s - loss: 0.2714 - mae: 0.2714 - val_loss: 0.2859 - val_mae: 0.2859 - lr: 4.4668e-06 - 127ms/epoch - 2ms/step
Epoch 15/120
66/66 - 0s - loss: 0.2644 - mae: 0.2644 - val_loss: 0.2786 - val_mae: 0.2786 - lr: 5.0119e-06 - 160ms/epoch - 2ms/step
Epoch 16/120
66/66 - 0s - loss: 0.2569 - mae: 0.2569 - val_loss: 0.2708 - val_mae: 0.2708 - lr: 5.6234e-06 - 156ms/epoch - 2ms/step
Epoch 17/120
66/66 - 0s - loss: 0.2493 - mae: 0.2493 - val_loss: 0.2629 - val_mae: 0.2629 - lr: 6.3096e-06 - 156ms/epoch - 2ms/step
Epoch 18/120
66/66 - 0s - loss: 0.2418 - mae: 0.2418 - val_loss: 0.2547 - val_mae: 0.2547 - lr: 7.0795e-06 - 147ms/epoch - 2ms/step
Epoch 19/120
66/66 - 0s - loss: 0.2341 - mae: 0.2341 - val_loss: 0.2467 - val_mae: 0.2467 - lr: 7.9433e-06 - 96ms/epoch - 1ms/step
Epoch 20/120
66/66 - 0s - loss: 0.2272 - mae: 0.2272 - val_loss: 0.2397 - val_mae: 0.2397 - lr: 8.9125e-06 - 163ms/epoch - 2ms/step
Epoch 21/120
66/66 - 0s - loss: 0.2207 - mae: 0.2207 - val_loss: 0.2330 - val_mae: 0.2330 - lr: 1.0000e-05 - 146ms/epoch - 2ms/step
Epoch 22/120
66/66 - 0s - loss: 0.2146 - mae: 0.2146 - val_loss: 0.2262 - val_mae: 0.2262 - lr: 1.1220e-05 - 133ms/epoch - 2ms/step
Epoch 23/120
66/66 - 0s - loss: 0.2089 - mae: 0.2089 - val_loss: 0.2196 - val_mae: 0.2196 - lr: 1.2589e-05 - 115ms/epoch - 2ms/step
Epoch 24/120
66/66 - 0s - loss: 0.2032 - mae: 0.2032 - val_loss: 0.2129 - val_mae: 0.2129 - lr: 1.4125e-05 - 132ms/epoch - 2ms/step
Epoch 25/120
66/66 - 0s - loss: 0.1975 - mae: 0.1975 - val_loss: 0.2061 - val_mae: 0.2061 - lr: 1.5849e-05 - 162ms/epoch - 2ms/step
Epoch 26/120
66/66 - 0s - loss: 0.1918 - mae: 0.1918 - val_loss: 0.1991 - val_mae: 0.1991 - lr: 1.7783e-05 - 132ms/epoch - 2ms/step
Epoch 27/120
66/66 - 0s - loss: 0.1861 - mae: 0.1861 - val_loss: 0.1922 - val_mae: 0.1922 - lr: 1.9953e-05 - 122ms/epoch - 2ms/step
Epoch 28/120
66/66 - 0s - loss: 0.1802 - mae: 0.1802 - val_loss: 0.1846 - val_mae: 0.1846 - lr: 2.2387e-05 - 168ms/epoch - 3ms/step
Epoch 29/120
66/66 - 0s - loss: 0.1738 - mae: 0.1738 - val_loss: 0.1761 - val_mae: 0.1761 - lr: 2.5119e-05 - 166ms/epoch - 3ms/step
Epoch 30/120
66/66 - 0s - loss: 0.1659 - mae: 0.1659 - val_loss: 0.1651 - val_mae: 0.1651 - lr: 2.8184e-05 - 160ms/epoch - 2ms/step
Epoch 31/120
66/66 - 0s - loss: 0.1544 - mae: 0.1544 - val_loss: 0.1485 - val_mae: 0.1485 - lr: 3.1623e-05 - 113ms/epoch - 2ms/step
Epoch 32/120
66/66 - 0s - loss: 0.1367 - mae: 0.1367 - val_loss: 0.1270 - val_mae: 0.1270 - lr: 3.5481e-05 - 127ms/epoch - 2ms/step
Epoch 33/120
66/66 - 0s - loss: 0.1160 - mae: 0.1160 - val_loss: 0.1055 - val_mae: 0.1055 - lr: 3.9811e-05 - 161ms/epoch - 2ms/step
Epoch 34/120
66/66 - 0s - loss: 0.0993 - mae: 0.0993 - val_loss: 0.0941 - val_mae: 0.0941 - lr: 4.4668e-05 - 171ms/epoch - 3ms/step
Epoch 35/120
66/66 - 0s - loss: 0.0898 - mae: 0.0898 - val_loss: 0.0876 - val_mae: 0.0876 - lr: 5.0119e-05 - 139ms/epoch - 2ms/step
Epoch 36/120
66/66 - 0s - loss: 0.0850 - mae: 0.0850 - val_loss: 0.0846 - val_mae: 0.0846 - lr: 5.6234e-05 - 110ms/epoch - 2ms/step
Epoch 37/120
66/66 - 0s - loss: 0.0821 - mae: 0.0821 - val_loss: 0.0811 - val_mae: 0.0811 - lr: 6.3096e-05 - 173ms/epoch - 3ms/step
Epoch 38/120
66/66 - 0s - loss: 0.0778 - mae: 0.0778 - val_loss: 0.0779 - val_mae: 0.0779 - lr: 7.0795e-05 - 160ms/epoch - 2ms/step
Epoch 39/120
66/66 - 0s - loss: 0.0735 - mae: 0.0735 - val_loss: 0.0744 - val_mae: 0.0744 - lr: 7.9433e-05 - 141ms/epoch - 2ms/step
Epoch 40/120
66/66 - 0s - loss: 0.0695 - mae: 0.0695 - val_loss: 0.0702 - val_mae: 0.0702 - lr: 8.9125e-05 - 135ms/epoch - 2ms/step
Epoch 41/120
66/66 - 0s - loss: 0.0634 - mae: 0.0634 - val_loss: 0.0659 - val_mae: 0.0659 - lr: 1.0000e-04 - 114ms/epoch - 2ms/step
Epoch 42/120
66/66 - 0s - loss: 0.0582 - mae: 0.0582 - val_loss: 0.0612 - val_mae: 0.0612 - lr: 1.1220e-04 - 161ms/epoch - 2ms/step
Epoch 43/120
66/66 - 0s - loss: 0.0541 - mae: 0.0541 - val_loss: 0.0604 - val_mae: 0.0604 - lr: 1.2589e-04 - 156ms/epoch - 2ms/step
Epoch 44/120
66/66 - 0s - loss: 0.0507 - mae: 0.0507 - val_loss: 0.0566 - val_mae: 0.0566 - lr: 1.4125e-04 - 108ms/epoch - 2ms/step
Epoch 45/120
66/66 - 0s - loss: 0.0504 - mae: 0.0504 - val_loss: 0.0576 - val_mae: 0.0576 - lr: 1.5849e-04 - 103ms/epoch - 2ms/step
Epoch 46/120
66/66 - 0s - loss: 0.0486 - mae: 0.0486 - val_loss: 0.0524 - val_mae: 0.0524 - lr: 1.7783e-04 - 110ms/epoch - 2ms/step
Epoch 47/120
66/66 - 0s - loss: 0.0464 - mae: 0.0464 - val_loss: 0.0510 - val_mae: 0.0510 - lr: 1.9953e-04 - 119ms/epoch - 2ms/step
Epoch 48/120
66/66 - 0s - loss: 0.0453 - mae: 0.0453 - val_loss: 0.0500 - val_mae: 0.0500 - lr: 2.2387e-04 - 166ms/epoch - 3ms/step
Epoch 49/120
66/66 - 0s - loss: 0.0443 - mae: 0.0443 - val_loss: 0.0487 - val_mae: 0.0487 - lr: 2.5119e-04 - 123ms/epoch - 2ms/step
Epoch 50/120
66/66 - 0s - loss: 0.0430 - mae: 0.0430 - val_loss: 0.0476 - val_mae: 0.0476 - lr: 2.8184e-04 - 192ms/epoch - 3ms/step
Epoch 51/120
66/66 - 0s - loss: 0.0414 - mae: 0.0414 - val_loss: 0.0447 - val_mae: 0.0447 - lr: 3.1623e-04 - 107ms/epoch - 2ms/step
Epoch 52/120
66/66 - 0s - loss: 0.0419 - mae: 0.0419 - val_loss: 0.0456 - val_mae: 0.0456 - lr: 3.5481e-04 - 146ms/epoch - 2ms/step
Epoch 53/120
66/66 - 0s - loss: 0.0406 - mae: 0.0406 - val_loss: 0.0443 - val_mae: 0.0443 - lr: 3.9811e-04 - 144ms/epoch - 2ms/step
Epoch 54/120
66/66 - 0s - loss: 0.0397 - mae: 0.0397 - val_loss: 0.0412 - val_mae: 0.0412 - lr: 4.4668e-04 - 139ms/epoch - 2ms/step
Epoch 55/120
66/66 - 0s - loss: 0.0373 - mae: 0.0373 - val_loss: 0.0393 - val_mae: 0.0393 - lr: 5.0119e-04 - 162ms/epoch - 2ms/step
Epoch 56/120
66/66 - 0s - loss: 0.0357 - mae: 0.0357 - val_loss: 0.0404 - val_mae: 0.0404 - lr: 5.6234e-04 - 168ms/epoch - 3ms/step
Epoch 57/120
66/66 - 0s - loss: 0.0361 - mae: 0.0361 - val_loss: 0.0367 - val_mae: 0.0367 - lr: 6.3096e-04 - 148ms/epoch - 2ms/step
Epoch 58/120
66/66 - 0s - loss: 0.0336 - mae: 0.0336 - val_loss: 0.0400 - val_mae: 0.0400 - lr: 7.0795e-04 - 133ms/epoch - 2ms/step
Epoch 59/120
66/66 - 0s - loss: 0.0331 - mae: 0.0331 - val_loss: 0.0362 - val_mae: 0.0362 - lr: 7.9433e-04 - 119ms/epoch - 2ms/step
Epoch 60/120
66/66 - 0s - loss: 0.0343 - mae: 0.0343 - val_loss: 0.0345 - val_mae: 0.0345 - lr: 8.9125e-04 - 130ms/epoch - 2ms/step
Epoch 61/120
66/66 - 0s - loss: 0.0307 - mae: 0.0307 - val_loss: 0.0516 - val_mae: 0.0516 - lr: 0.0010 - 88ms/epoch - 1ms/step
Epoch 62/120
66/66 - 0s - loss: 0.0321 - mae: 0.0321 - val_loss: 0.0292 - val_mae: 0.0292 - lr: 0.0011 - 96ms/epoch - 1ms/step
Epoch 63/120
66/66 - 0s - loss: 0.0333 - mae: 0.0333 - val_loss: 0.0299 - val_mae: 0.0299 - lr: 0.0013 - 134ms/epoch - 2ms/step
Epoch 64/120
66/66 - 0s - loss: 0.0362 - mae: 0.0362 - val_loss: 0.0342 - val_mae: 0.0342 - lr: 0.0014 - 162ms/epoch - 2ms/step
Epoch 65/120
66/66 - 0s - loss: 0.0294 - mae: 0.0294 - val_loss: 0.0515 - val_mae: 0.0515 - lr: 0.0016 - 171ms/epoch - 3ms/step
Epoch 66/120
66/66 - 0s - loss: 0.0294 - mae: 0.0294 - val_loss: 0.0255 - val_mae: 0.0255 - lr: 0.0018 - 162ms/epoch - 2ms/step
Epoch 67/120
66/66 - 0s - loss: 0.0291 - mae: 0.0291 - val_loss: 0.0255 - val_mae: 0.0255 - lr: 0.0020 - 154ms/epoch - 2ms/step
Epoch 68/120
66/66 - 0s - loss: 0.0304 - mae: 0.0304 - val_loss: 0.0241 - val_mae: 0.0241 - lr: 0.0022 - 117ms/epoch - 2ms/step
Epoch 69/120
66/66 - 0s - loss: 0.0343 - mae: 0.0343 - val_loss: 0.0381 - val_mae: 0.0381 - lr: 0.0025 - 123ms/epoch - 2ms/step
Epoch 70/120
66/66 - 0s - loss: 0.0329 - mae: 0.0329 - val_loss: 0.0234 - val_mae: 0.0234 - lr: 0.0028 - 107ms/epoch - 2ms/step
Epoch 71/120
66/66 - 0s - loss: 0.0357 - mae: 0.0357 - val_loss: 0.0261 - val_mae: 0.0261 - lr: 0.0032 - 156ms/epoch - 2ms/step
Epoch 72/120
66/66 - 0s - loss: 0.0320 - mae: 0.0320 - val_loss: 0.0436 - val_mae: 0.0436 - lr: 0.0035 - 121ms/epoch - 2ms/step
Epoch 73/120
66/66 - 0s - loss: 0.0351 - mae: 0.0351 - val_loss: 0.0247 - val_mae: 0.0247 - lr: 0.0040 - 120ms/epoch - 2ms/step
Epoch 74/120
66/66 - 0s - loss: 0.0338 - mae: 0.0338 - val_loss: 0.0416 - val_mae: 0.0416 - lr: 0.0045 - 112ms/epoch - 2ms/step
Epoch 75/120
66/66 - 0s - loss: 0.0317 - mae: 0.0317 - val_loss: 0.0250 - val_mae: 0.0250 - lr: 0.0050 - 115ms/epoch - 2ms/step
Epoch 76/120
66/66 - 0s - loss: 0.0315 - mae: 0.0315 - val_loss: 0.0241 - val_mae: 0.0241 - lr: 0.0056 - 99ms/epoch - 2ms/step
Epoch 77/120
66/66 - 0s - loss: 0.0322 - mae: 0.0322 - val_loss: 0.0520 - val_mae: 0.0520 - lr: 0.0063 - 117ms/epoch - 2ms/step
Epoch 78/120
66/66 - 0s - loss: 0.0294 - mae: 0.0294 - val_loss: 0.0273 - val_mae: 0.0273 - lr: 0.0071 - 124ms/epoch - 2ms/step
Epoch 79/120
66/66 - 0s - loss: 0.0276 - mae: 0.0276 - val_loss: 0.0203 - val_mae: 0.0203 - lr: 0.0079 - 147ms/epoch - 2ms/step
Epoch 80/120
66/66 - 0s - loss: 0.0425 - mae: 0.0425 - val_loss: 0.1456 - val_mae: 0.1456 - lr: 0.0089 - 131ms/epoch - 2ms/step
Epoch 81/120
66/66 - 0s - loss: 0.0425 - mae: 0.0425 - val_loss: 0.0475 - val_mae: 0.0475 - lr: 0.0100 - 108ms/epoch - 2ms/step
Epoch 82/120
66/66 - 0s - loss: 0.0400 - mae: 0.0400 - val_loss: 0.0399 - val_mae: 0.0399 - lr: 0.0112 - 115ms/epoch - 2ms/step
Epoch 83/120
66/66 - 0s - loss: 0.0344 - mae: 0.0344 - val_loss: 0.0297 - val_mae: 0.0297 - lr: 0.0126 - 106ms/epoch - 2ms/step
Epoch 84/120
66/66 - 0s - loss: 0.0307 - mae: 0.0307 - val_loss: 0.0345 - val_mae: 0.0345 - lr: 0.0141 - 106ms/epoch - 2ms/step
Epoch 85/120
66/66 - 0s - loss: 0.0473 - mae: 0.0473 - val_loss: 0.0351 - val_mae: 0.0351 - lr: 0.0158 - 111ms/epoch - 2ms/step
Epoch 86/120
66/66 - 0s - loss: 0.0343 - mae: 0.0343 - val_loss: 0.0696 - val_mae: 0.0696 - lr: 0.0178 - 90ms/epoch - 1ms/step
Epoch 87/120
66/66 - 0s - loss: 0.0380 - mae: 0.0380 - val_loss: 0.0264 - val_mae: 0.0264 - lr: 0.0200 - 82ms/epoch - 1ms/step
Epoch 88/120
66/66 - 0s - loss: 0.0350 - mae: 0.0350 - val_loss: 0.0407 - val_mae: 0.0407 - lr: 0.0224 - 100ms/epoch - 2ms/step
Epoch 89/120
66/66 - 0s - loss: 0.0443 - mae: 0.0443 - val_loss: 0.0283 - val_mae: 0.0283 - lr: 0.0251 - 92ms/epoch - 1ms/step
Epoch 90/120
66/66 - 0s - loss: 0.0392 - mae: 0.0392 - val_loss: 0.0272 - val_mae: 0.0272 - lr: 0.0282 - 130ms/epoch - 2ms/step
Epoch 91/120
66/66 - 0s - loss: 0.0432 - mae: 0.0432 - val_loss: 0.0306 - val_mae: 0.0306 - lr: 0.0316 - 87ms/epoch - 1ms/step
Epoch 92/120
66/66 - 0s - loss: 0.0419 - mae: 0.0419 - val_loss: 0.0418 - val_mae: 0.0418 - lr: 0.0355 - 99ms/epoch - 2ms/step
Epoch 93/120
66/66 - 0s - loss: 0.0575 - mae: 0.0575 - val_loss: 0.0399 - val_mae: 0.0399 - lr: 0.0398 - 98ms/epoch - 1ms/step
Epoch 94/120
66/66 - 0s - loss: 0.0422 - mae: 0.0422 - val_loss: 0.0528 - val_mae: 0.0528 - lr: 0.0447 - 88ms/epoch - 1ms/step
Epoch 95/120
66/66 - 0s - loss: 0.0526 - mae: 0.0526 - val_loss: 0.0688 - val_mae: 0.0688 - lr: 0.0501 - 89ms/epoch - 1ms/step
Epoch 96/120
66/66 - 0s - loss: 0.0499 - mae: 0.0499 - val_loss: 0.0303 - val_mae: 0.0303 - lr: 0.0562 - 95ms/epoch - 1ms/step
Epoch 97/120
66/66 - 0s - loss: 0.0530 - mae: 0.0530 - val_loss: 0.1139 - val_mae: 0.1139 - lr: 0.0631 - 87ms/epoch - 1ms/step
Epoch 98/120
66/66 - 0s - loss: 0.1541 - mae: 0.1541 - val_loss: 0.1657 - val_mae: 0.1657 - lr: 0.0708 - 98ms/epoch - 1ms/step
Epoch 99/120
66/66 - 0s - loss: 0.1683 - mae: 0.1683 - val_loss: 0.1691 - val_mae: 0.1691 - lr: 0.0794 - 102ms/epoch - 2ms/step
Epoch 100/120
66/66 - 0s - loss: 0.1699 - mae: 0.1699 - val_loss: 0.1663 - val_mae: 0.1663 - lr: 0.0891 - 128ms/epoch - 2ms/step
Epoch 101/120
66/66 - 0s - loss: 0.1683 - mae: 0.1683 - val_loss: 0.1664 - val_mae: 0.1664 - lr: 0.1000 - 131ms/epoch - 2ms/step
Epoch 102/120
66/66 - 0s - loss: 0.1690 - mae: 0.1690 - val_loss: 0.1698 - val_mae: 0.1698 - lr: 0.1122 - 91ms/epoch - 1ms/step
Epoch 103/120
66/66 - 0s - loss: 0.1694 - mae: 0.1694 - val_loss: 0.1706 - val_mae: 0.1706 - lr: 0.1259 - 106ms/epoch - 2ms/step
Epoch 104/120
66/66 - 0s - loss: 0.1713 - mae: 0.1713 - val_loss: 0.1670 - val_mae: 0.1670 - lr: 0.1413 - 108ms/epoch - 2ms/step
Epoch 105/120
66/66 - 0s - loss: 0.1707 - mae: 0.1707 - val_loss: 0.1675 - val_mae: 0.1675 - lr: 0.1585 - 95ms/epoch - 1ms/step
Epoch 106/120
66/66 - 0s - loss: 0.1690 - mae: 0.1690 - val_loss: 0.1668 - val_mae: 0.1668 - lr: 0.1778 - 97ms/epoch - 1ms/step
Epoch 107/120
66/66 - 0s - loss: 0.1698 - mae: 0.1698 - val_loss: 0.1667 - val_mae: 0.1667 - lr: 0.1995 - 101ms/epoch - 2ms/step
Epoch 108/120
66/66 - 0s - loss: 0.1691 - mae: 0.1691 - val_loss: 0.1818 - val_mae: 0.1818 - lr: 0.2239 - 111ms/epoch - 2ms/step
Epoch 109/120
66/66 - 0s - loss: 0.1730 - mae: 0.1730 - val_loss: 0.1724 - val_mae: 0.1724 - lr: 0.2512 - 117ms/epoch - 2ms/step
Epoch 110/120
66/66 - 0s - loss: 0.1726 - mae: 0.1726 - val_loss: 0.1754 - val_mae: 0.1754 - lr: 0.2818 - 107ms/epoch - 2ms/step
Epoch 111/120
66/66 - 0s - loss: 0.1761 - mae: 0.1761 - val_loss: 0.1662 - val_mae: 0.1662 - lr: 0.3162 - 160ms/epoch - 2ms/step
Epoch 112/120
66/66 - 0s - loss: 0.1738 - mae: 0.1738 - val_loss: 0.1758 - val_mae: 0.1758 - lr: 0.3548 - 169ms/epoch - 3ms/step
Epoch 113/120
66/66 - 0s - loss: 0.1821 - mae: 0.1821 - val_loss: 0.1653 - val_mae: 0.1653 - lr: 0.3981 - 102ms/epoch - 2ms/step
Epoch 114/120
66/66 - 0s - loss: 0.1748 - mae: 0.1748 - val_loss: 0.1813 - val_mae: 0.1813 - lr: 0.4467 - 112ms/epoch - 2ms/step
Epoch 115/120
66/66 - 0s - loss: 0.1803 - mae: 0.1803 - val_loss: 0.1690 - val_mae: 0.1690 - lr: 0.5012 - 100ms/epoch - 2ms/step
Epoch 116/120
66/66 - 0s - loss: 0.1829 - mae: 0.1829 - val_loss: 0.1654 - val_mae: 0.1654 - lr: 0.5623 - 158ms/epoch - 2ms/step
Epoch 117/120
66/66 - 0s - loss: 0.1767 - mae: 0.1767 - val_loss: 0.1739 - val_mae: 0.1739 - lr: 0.6310 - 162ms/epoch - 2ms/step
Epoch 118/120
66/66 - 0s - loss: 0.1783 - mae: 0.1783 - val_loss: 0.1683 - val_mae: 0.1683 - lr: 0.7079 - 122ms/epoch - 2ms/step
Epoch 119/120
66/66 - 0s - loss: 0.1763 - mae: 0.1763 - val_loss: 0.2423 - val_mae: 0.2423 - lr: 0.7943 - 116ms/epoch - 2ms/step
Epoch 120/120
66/66 - 0s - loss: 0.1826 - mae: 0.1826 - val_loss: 0.1664 - val_mae: 0.1664 - lr: 0.8913 - 140ms/epoch - 2ms/step
In [47]:
lr_history = history.history["lr"]
loss_history = history.history["loss"]

fig_lr = go.Figure()
fig_lr.add_trace(go.Line(
    x = lr_history,
    y = loss_history,
    legendgroup = 'plot',
    legendgrouptitle_text = 'plot',
    name = 'Loss',
    marker = dict(color='Red'),
))
fig_lr.layout.update(xaxis_range=[0,5e-3],title_text='Figure 4.3: Loss vs. Learning Rate',xaxis_title='Learning rate',yaxis_title='Loss')
fig_lr.show()

The above graph (Figure 4.3) shows the loss plotted against the learning rate. The optimal learning rate is immediately right of the elbow of the graph, which is observed to be approximately 0.001. That learning rate is used in the final model.

Part 4.4: Improving the model accuracy further

Now it is known that dropping certain variables makes the model prediction more accurate. Thus the top 3 variables that contribute to the inaccuracy is selected to be dropped from the final model. Conversely, the variables with the least contribution to the inaccuracy can be considered vital to the accuracy of the model and must be included during training.

Further, Figure 2.2 also illustrated that classLabel was imbalanced with more 'yes' than 'no' examples. In order to get a more balanced model, RandomOverSampler is used to train the model on more of the 'no' examples.

In [57]:
model = seq_model(lr=0.001)

#Automatically exclude the top 3 variables that contribute to the model inaccuracy
#var_to_exclude = list(accuracy_df[:3]['var'])

x_train = x_train_bkp
y_train = y_train_bkp
x_test = x_test_bkp
y_test = y_test_bkp

#print(x_train.columns)

x_train = x_train.drop(columns=var_to_exclude)
x_test = x_test.drop(columns=var_to_exclude)

#Random Over Sampler to reduce the effect of classLabel imbalance 
#ros = RandomOverSampler()
#x_train, y_train = ros.fit_resample(x_train,y_train)

#Save the best model to avoid overfitting
checkpoint_filepath = wd + '//ckpt//model.ckpt'
model_checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
    filepath=checkpoint_filepath,
    save_weights_only=True,
    monitor='val_mae',
    mode='min',
    save_best_only=True)

#print("\nEvaluation of model without variables %s, %s and %s" %(var_to_exclude[0],var_to_exclude[1],var_to_exclude[2]))
history = model.fit(x_train,y_train,epochs=200,validation_data=(x_test,y_test),callbacks=[model_checkpoint_callback],verbose=2)

model.load_weights(checkpoint_filepath)
test_data = (x_test,y_test)
yhat_test = model_predict(model,test_data)

#dummy_arr = np.zeros((np.shape(y_test)[0],no_of_dummy_col))
y_test = np.expand_dims(y_test,axis=1)
#y_test = np.concatenate((dummy_arr,y_test),axis=1)
#yhat_test = np.concatenate((dummy_arr,yhat_test),axis=1)

scaler = MinMaxScaler().fit(comb_num_df_bkp['SalePrice'].to_numpy().reshape(-1,1))

y_test = scaler.inverse_transform(y_test)
yhat_test = scaler.inverse_transform(yhat_test)
print(mean_absolute_error(yhat_test[:,-1],y_test[:,-1]))
Epoch 1/200
66/66 - 1s - loss: 0.1843 - mae: 0.1843 - val_loss: 0.1201 - val_mae: 0.1201 - 742ms/epoch - 11ms/step
Epoch 2/200
66/66 - 0s - loss: 0.1021 - mae: 0.1021 - val_loss: 0.0882 - val_mae: 0.0882 - 200ms/epoch - 3ms/step
Epoch 3/200
66/66 - 0s - loss: 0.0853 - mae: 0.0853 - val_loss: 0.0780 - val_mae: 0.0780 - 200ms/epoch - 3ms/step
Epoch 4/200
66/66 - 0s - loss: 0.0707 - mae: 0.0707 - val_loss: 0.0662 - val_mae: 0.0662 - 162ms/epoch - 2ms/step
Epoch 5/200
66/66 - 0s - loss: 0.0543 - mae: 0.0543 - val_loss: 0.0538 - val_mae: 0.0538 - 197ms/epoch - 3ms/step
Epoch 6/200
66/66 - 0s - loss: 0.0455 - mae: 0.0455 - val_loss: 0.0462 - val_mae: 0.0462 - 199ms/epoch - 3ms/step
Epoch 7/200
66/66 - 0s - loss: 0.0398 - mae: 0.0398 - val_loss: 0.0372 - val_mae: 0.0372 - 190ms/epoch - 3ms/step
Epoch 8/200
66/66 - 0s - loss: 0.0359 - mae: 0.0359 - val_loss: 0.0338 - val_mae: 0.0338 - 170ms/epoch - 3ms/step
Epoch 9/200
66/66 - 0s - loss: 0.0318 - mae: 0.0318 - val_loss: 0.0390 - val_mae: 0.0390 - 125ms/epoch - 2ms/step
Epoch 10/200
66/66 - 0s - loss: 0.0302 - mae: 0.0302 - val_loss: 0.0291 - val_mae: 0.0291 - 204ms/epoch - 3ms/step
Epoch 11/200
66/66 - 0s - loss: 0.0275 - mae: 0.0275 - val_loss: 0.0288 - val_mae: 0.0288 - 188ms/epoch - 3ms/step
Epoch 12/200
66/66 - 0s - loss: 0.0273 - mae: 0.0273 - val_loss: 0.0334 - val_mae: 0.0334 - 125ms/epoch - 2ms/step
Epoch 13/200
66/66 - 0s - loss: 0.0252 - mae: 0.0252 - val_loss: 0.0267 - val_mae: 0.0267 - 225ms/epoch - 3ms/step
Epoch 14/200
66/66 - 0s - loss: 0.0261 - mae: 0.0261 - val_loss: 0.0255 - val_mae: 0.0255 - 209ms/epoch - 3ms/step
Epoch 15/200
66/66 - 0s - loss: 0.0232 - mae: 0.0232 - val_loss: 0.0234 - val_mae: 0.0234 - 177ms/epoch - 3ms/step
Epoch 16/200
66/66 - 0s - loss: 0.0227 - mae: 0.0227 - val_loss: 0.0234 - val_mae: 0.0234 - 216ms/epoch - 3ms/step
Epoch 17/200
66/66 - 0s - loss: 0.0217 - mae: 0.0217 - val_loss: 0.0212 - val_mae: 0.0212 - 160ms/epoch - 2ms/step
Epoch 18/200
66/66 - 0s - loss: 0.0210 - mae: 0.0210 - val_loss: 0.0216 - val_mae: 0.0216 - 133ms/epoch - 2ms/step
Epoch 19/200
66/66 - 0s - loss: 0.0200 - mae: 0.0200 - val_loss: 0.0202 - val_mae: 0.0202 - 211ms/epoch - 3ms/step
Epoch 20/200
66/66 - 0s - loss: 0.0192 - mae: 0.0192 - val_loss: 0.0194 - val_mae: 0.0194 - 165ms/epoch - 2ms/step
Epoch 21/200
66/66 - 0s - loss: 0.0196 - mae: 0.0196 - val_loss: 0.0217 - val_mae: 0.0217 - 146ms/epoch - 2ms/step
Epoch 22/200
66/66 - 0s - loss: 0.0190 - mae: 0.0190 - val_loss: 0.0212 - val_mae: 0.0212 - 137ms/epoch - 2ms/step
Epoch 23/200
66/66 - 0s - loss: 0.0177 - mae: 0.0177 - val_loss: 0.0184 - val_mae: 0.0184 - 170ms/epoch - 3ms/step
Epoch 24/200
66/66 - 0s - loss: 0.0177 - mae: 0.0177 - val_loss: 0.0194 - val_mae: 0.0194 - 169ms/epoch - 3ms/step
Epoch 25/200
66/66 - 0s - loss: 0.0182 - mae: 0.0182 - val_loss: 0.0203 - val_mae: 0.0203 - 133ms/epoch - 2ms/step
Epoch 26/200
66/66 - 0s - loss: 0.0165 - mae: 0.0165 - val_loss: 0.0172 - val_mae: 0.0172 - 218ms/epoch - 3ms/step
Epoch 27/200
66/66 - 0s - loss: 0.0168 - mae: 0.0168 - val_loss: 0.0184 - val_mae: 0.0184 - 169ms/epoch - 3ms/step
Epoch 28/200
66/66 - 0s - loss: 0.0173 - mae: 0.0173 - val_loss: 0.0199 - val_mae: 0.0199 - 123ms/epoch - 2ms/step
Epoch 29/200
66/66 - 0s - loss: 0.0165 - mae: 0.0165 - val_loss: 0.0167 - val_mae: 0.0167 - 228ms/epoch - 3ms/step
Epoch 30/200
66/66 - 0s - loss: 0.0150 - mae: 0.0150 - val_loss: 0.0160 - val_mae: 0.0160 - 224ms/epoch - 3ms/step
Epoch 31/200
66/66 - 0s - loss: 0.0151 - mae: 0.0151 - val_loss: 0.0179 - val_mae: 0.0179 - 107ms/epoch - 2ms/step
Epoch 32/200
66/66 - 0s - loss: 0.0147 - mae: 0.0147 - val_loss: 0.0205 - val_mae: 0.0205 - 153ms/epoch - 2ms/step
Epoch 33/200
66/66 - 0s - loss: 0.0147 - mae: 0.0147 - val_loss: 0.0208 - val_mae: 0.0208 - 148ms/epoch - 2ms/step
Epoch 34/200
66/66 - 0s - loss: 0.0142 - mae: 0.0142 - val_loss: 0.0143 - val_mae: 0.0143 - 195ms/epoch - 3ms/step
Epoch 35/200
66/66 - 0s - loss: 0.0147 - mae: 0.0147 - val_loss: 0.0162 - val_mae: 0.0162 - 134ms/epoch - 2ms/step
Epoch 36/200
66/66 - 0s - loss: 0.0154 - mae: 0.0154 - val_loss: 0.0157 - val_mae: 0.0157 - 126ms/epoch - 2ms/step
Epoch 37/200
66/66 - 0s - loss: 0.0136 - mae: 0.0136 - val_loss: 0.0142 - val_mae: 0.0142 - 195ms/epoch - 3ms/step
Epoch 38/200
66/66 - 0s - loss: 0.0134 - mae: 0.0134 - val_loss: 0.0156 - val_mae: 0.0156 - 147ms/epoch - 2ms/step
Epoch 39/200
66/66 - 0s - loss: 0.0140 - mae: 0.0140 - val_loss: 0.0156 - val_mae: 0.0156 - 129ms/epoch - 2ms/step
Epoch 40/200
66/66 - 0s - loss: 0.0137 - mae: 0.0137 - val_loss: 0.0136 - val_mae: 0.0136 - 189ms/epoch - 3ms/step
Epoch 41/200
66/66 - 0s - loss: 0.0134 - mae: 0.0134 - val_loss: 0.0169 - val_mae: 0.0169 - 128ms/epoch - 2ms/step
Epoch 42/200
66/66 - 0s - loss: 0.0136 - mae: 0.0136 - val_loss: 0.0143 - val_mae: 0.0143 - 149ms/epoch - 2ms/step
Epoch 43/200
66/66 - 0s - loss: 0.0130 - mae: 0.0130 - val_loss: 0.0125 - val_mae: 0.0125 - 213ms/epoch - 3ms/step
Epoch 44/200
66/66 - 0s - loss: 0.0135 - mae: 0.0135 - val_loss: 0.0121 - val_mae: 0.0121 - 163ms/epoch - 2ms/step
Epoch 45/200
66/66 - 0s - loss: 0.0130 - mae: 0.0130 - val_loss: 0.0124 - val_mae: 0.0124 - 227ms/epoch - 3ms/step
Epoch 46/200
66/66 - 0s - loss: 0.0122 - mae: 0.0122 - val_loss: 0.0116 - val_mae: 0.0116 - 169ms/epoch - 3ms/step
Epoch 47/200
66/66 - 0s - loss: 0.0129 - mae: 0.0129 - val_loss: 0.0137 - val_mae: 0.0137 - 142ms/epoch - 2ms/step
Epoch 48/200
66/66 - 0s - loss: 0.0117 - mae: 0.0117 - val_loss: 0.0162 - val_mae: 0.0162 - 117ms/epoch - 2ms/step
Epoch 49/200
66/66 - 0s - loss: 0.0128 - mae: 0.0128 - val_loss: 0.0124 - val_mae: 0.0124 - 152ms/epoch - 2ms/step
Epoch 50/200
66/66 - 0s - loss: 0.0113 - mae: 0.0113 - val_loss: 0.0111 - val_mae: 0.0111 - 256ms/epoch - 4ms/step
Epoch 51/200
66/66 - 0s - loss: 0.0110 - mae: 0.0110 - val_loss: 0.0134 - val_mae: 0.0134 - 176ms/epoch - 3ms/step
Epoch 52/200
66/66 - 0s - loss: 0.0123 - mae: 0.0123 - val_loss: 0.0107 - val_mae: 0.0107 - 197ms/epoch - 3ms/step
Epoch 53/200
66/66 - 0s - loss: 0.0124 - mae: 0.0124 - val_loss: 0.0109 - val_mae: 0.0109 - 167ms/epoch - 3ms/step
Epoch 54/200
66/66 - 0s - loss: 0.0110 - mae: 0.0110 - val_loss: 0.0125 - val_mae: 0.0125 - 149ms/epoch - 2ms/step
Epoch 55/200
66/66 - 0s - loss: 0.0115 - mae: 0.0115 - val_loss: 0.0106 - val_mae: 0.0106 - 203ms/epoch - 3ms/step
Epoch 56/200
66/66 - 0s - loss: 0.0099 - mae: 0.0099 - val_loss: 0.0102 - val_mae: 0.0102 - 216ms/epoch - 3ms/step
Epoch 57/200
66/66 - 0s - loss: 0.0115 - mae: 0.0115 - val_loss: 0.0143 - val_mae: 0.0143 - 126ms/epoch - 2ms/step
Epoch 58/200
66/66 - 0s - loss: 0.0115 - mae: 0.0115 - val_loss: 0.0111 - val_mae: 0.0111 - 109ms/epoch - 2ms/step
Epoch 59/200
66/66 - 0s - loss: 0.0104 - mae: 0.0104 - val_loss: 0.0102 - val_mae: 0.0102 - 153ms/epoch - 2ms/step
Epoch 60/200
66/66 - 0s - loss: 0.0100 - mae: 0.0100 - val_loss: 0.0093 - val_mae: 0.0093 - 187ms/epoch - 3ms/step
Epoch 61/200
66/66 - 0s - loss: 0.0097 - mae: 0.0097 - val_loss: 0.0111 - val_mae: 0.0111 - 154ms/epoch - 2ms/step
Epoch 62/200
66/66 - 0s - loss: 0.0092 - mae: 0.0092 - val_loss: 0.0096 - val_mae: 0.0096 - 135ms/epoch - 2ms/step
Epoch 63/200
66/66 - 0s - loss: 0.0096 - mae: 0.0096 - val_loss: 0.0107 - val_mae: 0.0107 - 114ms/epoch - 2ms/step
Epoch 64/200
66/66 - 0s - loss: 0.0086 - mae: 0.0086 - val_loss: 0.0100 - val_mae: 0.0100 - 156ms/epoch - 2ms/step
Epoch 65/200
66/66 - 0s - loss: 0.0095 - mae: 0.0095 - val_loss: 0.0121 - val_mae: 0.0121 - 130ms/epoch - 2ms/step
Epoch 66/200
66/66 - 0s - loss: 0.0132 - mae: 0.0132 - val_loss: 0.0085 - val_mae: 0.0085 - 147ms/epoch - 2ms/step
Epoch 67/200
66/66 - 0s - loss: 0.0087 - mae: 0.0087 - val_loss: 0.0107 - val_mae: 0.0107 - 128ms/epoch - 2ms/step
Epoch 68/200
66/66 - 0s - loss: 0.0093 - mae: 0.0093 - val_loss: 0.0089 - val_mae: 0.0089 - 76ms/epoch - 1ms/step
Epoch 69/200
66/66 - 0s - loss: 0.0101 - mae: 0.0101 - val_loss: 0.0124 - val_mae: 0.0124 - 84ms/epoch - 1ms/step
Epoch 70/200
66/66 - 0s - loss: 0.0093 - mae: 0.0093 - val_loss: 0.0082 - val_mae: 0.0082 - 142ms/epoch - 2ms/step
Epoch 71/200
66/66 - 0s - loss: 0.0090 - mae: 0.0090 - val_loss: 0.0096 - val_mae: 0.0096 - 106ms/epoch - 2ms/step
Epoch 72/200
66/66 - 0s - loss: 0.0080 - mae: 0.0080 - val_loss: 0.0114 - val_mae: 0.0114 - 107ms/epoch - 2ms/step
Epoch 73/200
66/66 - 0s - loss: 0.0088 - mae: 0.0088 - val_loss: 0.0078 - val_mae: 0.0078 - 140ms/epoch - 2ms/step
Epoch 74/200
66/66 - 0s - loss: 0.0081 - mae: 0.0081 - val_loss: 0.0080 - val_mae: 0.0080 - 114ms/epoch - 2ms/step
Epoch 75/200
66/66 - 0s - loss: 0.0074 - mae: 0.0074 - val_loss: 0.0081 - val_mae: 0.0081 - 137ms/epoch - 2ms/step
Epoch 76/200
66/66 - 0s - loss: 0.0085 - mae: 0.0085 - val_loss: 0.0084 - val_mae: 0.0084 - 95ms/epoch - 1ms/step
Epoch 77/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0169 - val_mae: 0.0169 - 128ms/epoch - 2ms/step
Epoch 78/200
66/66 - 0s - loss: 0.0106 - mae: 0.0106 - val_loss: 0.0070 - val_mae: 0.0070 - 154ms/epoch - 2ms/step
Epoch 79/200
66/66 - 0s - loss: 0.0071 - mae: 0.0071 - val_loss: 0.0075 - val_mae: 0.0075 - 98ms/epoch - 1ms/step
Epoch 80/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0074 - val_mae: 0.0074 - 104ms/epoch - 2ms/step
Epoch 81/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0069 - val_mae: 0.0069 - 129ms/epoch - 2ms/step
Epoch 82/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0059 - val_mae: 0.0059 - 149ms/epoch - 2ms/step
Epoch 83/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0066 - val_mae: 0.0066 - 108ms/epoch - 2ms/step
Epoch 84/200
66/66 - 0s - loss: 0.0063 - mae: 0.0063 - val_loss: 0.0065 - val_mae: 0.0065 - 96ms/epoch - 1ms/step
Epoch 85/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0068 - val_mae: 0.0068 - 96ms/epoch - 1ms/step
Epoch 86/200
66/66 - 0s - loss: 0.0078 - mae: 0.0078 - val_loss: 0.0091 - val_mae: 0.0091 - 162ms/epoch - 2ms/step
Epoch 87/200
66/66 - 0s - loss: 0.0064 - mae: 0.0064 - val_loss: 0.0062 - val_mae: 0.0062 - 143ms/epoch - 2ms/step
Epoch 88/200
66/66 - 0s - loss: 0.0077 - mae: 0.0077 - val_loss: 0.0066 - val_mae: 0.0066 - 114ms/epoch - 2ms/step
Epoch 89/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0060 - val_mae: 0.0060 - 142ms/epoch - 2ms/step
Epoch 90/200
66/66 - 0s - loss: 0.0065 - mae: 0.0065 - val_loss: 0.0059 - val_mae: 0.0059 - 214ms/epoch - 3ms/step
Epoch 91/200
66/66 - 0s - loss: 0.0063 - mae: 0.0063 - val_loss: 0.0069 - val_mae: 0.0069 - 156ms/epoch - 2ms/step
Epoch 92/200
66/66 - 0s - loss: 0.0055 - mae: 0.0055 - val_loss: 0.0059 - val_mae: 0.0059 - 147ms/epoch - 2ms/step
Epoch 93/200
66/66 - 0s - loss: 0.0066 - mae: 0.0066 - val_loss: 0.0058 - val_mae: 0.0058 - 188ms/epoch - 3ms/step
Epoch 94/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0058 - val_mae: 0.0058 - 132ms/epoch - 2ms/step
Epoch 95/200
66/66 - 0s - loss: 0.0078 - mae: 0.0078 - val_loss: 0.0060 - val_mae: 0.0060 - 96ms/epoch - 1ms/step
Epoch 96/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0080 - val_mae: 0.0080 - 99ms/epoch - 2ms/step
Epoch 97/200
66/66 - 0s - loss: 0.0051 - mae: 0.0051 - val_loss: 0.0052 - val_mae: 0.0052 - 137ms/epoch - 2ms/step
Epoch 98/200
66/66 - 0s - loss: 0.0066 - mae: 0.0066 - val_loss: 0.0114 - val_mae: 0.0114 - 91ms/epoch - 1ms/step
Epoch 99/200
66/66 - 0s - loss: 0.0090 - mae: 0.0090 - val_loss: 0.0049 - val_mae: 0.0049 - 135ms/epoch - 2ms/step
Epoch 100/200
66/66 - 0s - loss: 0.0060 - mae: 0.0060 - val_loss: 0.0055 - val_mae: 0.0055 - 127ms/epoch - 2ms/step
Epoch 101/200
66/66 - 0s - loss: 0.0055 - mae: 0.0055 - val_loss: 0.0083 - val_mae: 0.0083 - 101ms/epoch - 2ms/step
Epoch 102/200
66/66 - 0s - loss: 0.0064 - mae: 0.0064 - val_loss: 0.0133 - val_mae: 0.0133 - 88ms/epoch - 1ms/step
Epoch 103/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0079 - val_mae: 0.0079 - 123ms/epoch - 2ms/step
Epoch 104/200
66/66 - 0s - loss: 0.0053 - mae: 0.0053 - val_loss: 0.0083 - val_mae: 0.0083 - 160ms/epoch - 2ms/step
Epoch 105/200
66/66 - 0s - loss: 0.0061 - mae: 0.0061 - val_loss: 0.0106 - val_mae: 0.0106 - 93ms/epoch - 1ms/step
Epoch 106/200
66/66 - 0s - loss: 0.0053 - mae: 0.0053 - val_loss: 0.0076 - val_mae: 0.0076 - 85ms/epoch - 1ms/step
Epoch 107/200
66/66 - 0s - loss: 0.0068 - mae: 0.0068 - val_loss: 0.0094 - val_mae: 0.0094 - 92ms/epoch - 1ms/step
Epoch 108/200
66/66 - 0s - loss: 0.0070 - mae: 0.0070 - val_loss: 0.0057 - val_mae: 0.0057 - 168ms/epoch - 3ms/step
Epoch 109/200
66/66 - 0s - loss: 0.0063 - mae: 0.0063 - val_loss: 0.0066 - val_mae: 0.0066 - 116ms/epoch - 2ms/step
Epoch 110/200
66/66 - 0s - loss: 0.0083 - mae: 0.0083 - val_loss: 0.0042 - val_mae: 0.0042 - 157ms/epoch - 2ms/step
Epoch 111/200
66/66 - 0s - loss: 0.0050 - mae: 0.0050 - val_loss: 0.0049 - val_mae: 0.0049 - 137ms/epoch - 2ms/step
Epoch 112/200
66/66 - 0s - loss: 0.0045 - mae: 0.0045 - val_loss: 0.0036 - val_mae: 0.0036 - 132ms/epoch - 2ms/step
Epoch 113/200
66/66 - 0s - loss: 0.0044 - mae: 0.0044 - val_loss: 0.0053 - val_mae: 0.0053 - 114ms/epoch - 2ms/step
Epoch 114/200
66/66 - 0s - loss: 0.0056 - mae: 0.0056 - val_loss: 0.0059 - val_mae: 0.0059 - 111ms/epoch - 2ms/step
Epoch 115/200
66/66 - 0s - loss: 0.0078 - mae: 0.0078 - val_loss: 0.0138 - val_mae: 0.0138 - 110ms/epoch - 2ms/step
Epoch 116/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0099 - val_mae: 0.0099 - 126ms/epoch - 2ms/step
Epoch 117/200
66/66 - 0s - loss: 0.0068 - mae: 0.0068 - val_loss: 0.0058 - val_mae: 0.0058 - 131ms/epoch - 2ms/step
Epoch 118/200
66/66 - 0s - loss: 0.0047 - mae: 0.0047 - val_loss: 0.0054 - val_mae: 0.0054 - 109ms/epoch - 2ms/step
Epoch 119/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0052 - val_mae: 0.0052 - 99ms/epoch - 1ms/step
Epoch 120/200
66/66 - 0s - loss: 0.0050 - mae: 0.0050 - val_loss: 0.0063 - val_mae: 0.0063 - 148ms/epoch - 2ms/step
Epoch 121/200
66/66 - 0s - loss: 0.0053 - mae: 0.0053 - val_loss: 0.0037 - val_mae: 0.0037 - 97ms/epoch - 1ms/step
Epoch 122/200
66/66 - 0s - loss: 0.0054 - mae: 0.0054 - val_loss: 0.0045 - val_mae: 0.0045 - 144ms/epoch - 2ms/step
Epoch 123/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0193 - val_mae: 0.0193 - 162ms/epoch - 2ms/step
Epoch 124/200
66/66 - 0s - loss: 0.0108 - mae: 0.0108 - val_loss: 0.0129 - val_mae: 0.0129 - 172ms/epoch - 3ms/step
Epoch 125/200
66/66 - 0s - loss: 0.0076 - mae: 0.0076 - val_loss: 0.0032 - val_mae: 0.0032 - 193ms/epoch - 3ms/step
Epoch 126/200
66/66 - 0s - loss: 0.0039 - mae: 0.0039 - val_loss: 0.0037 - val_mae: 0.0037 - 149ms/epoch - 2ms/step
Epoch 127/200
66/66 - 0s - loss: 0.0061 - mae: 0.0061 - val_loss: 0.0070 - val_mae: 0.0070 - 107ms/epoch - 2ms/step
Epoch 128/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0035 - val_mae: 0.0035 - 86ms/epoch - 1ms/step
Epoch 129/200
66/66 - 0s - loss: 0.0043 - mae: 0.0043 - val_loss: 0.0030 - val_mae: 0.0030 - 159ms/epoch - 2ms/step
Epoch 130/200
66/66 - 0s - loss: 0.0068 - mae: 0.0068 - val_loss: 0.0033 - val_mae: 0.0033 - 115ms/epoch - 2ms/step
Epoch 131/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0067 - val_mae: 0.0067 - 134ms/epoch - 2ms/step
Epoch 132/200
66/66 - 0s - loss: 0.0052 - mae: 0.0052 - val_loss: 0.0068 - val_mae: 0.0068 - 141ms/epoch - 2ms/step
Epoch 133/200
66/66 - 0s - loss: 0.0062 - mae: 0.0062 - val_loss: 0.0078 - val_mae: 0.0078 - 96ms/epoch - 1ms/step
Epoch 134/200
66/66 - 0s - loss: 0.0056 - mae: 0.0056 - val_loss: 0.0061 - val_mae: 0.0061 - 117ms/epoch - 2ms/step
Epoch 135/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0065 - val_mae: 0.0065 - 104ms/epoch - 2ms/step
Epoch 136/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0046 - val_mae: 0.0046 - 129ms/epoch - 2ms/step
Epoch 137/200
66/66 - 0s - loss: 0.0043 - mae: 0.0043 - val_loss: 0.0037 - val_mae: 0.0037 - 131ms/epoch - 2ms/step
Epoch 138/200
66/66 - 0s - loss: 0.0038 - mae: 0.0038 - val_loss: 0.0032 - val_mae: 0.0032 - 172ms/epoch - 3ms/step
Epoch 139/200
66/66 - 0s - loss: 0.0041 - mae: 0.0041 - val_loss: 0.0049 - val_mae: 0.0049 - 95ms/epoch - 1ms/step
Epoch 140/200
66/66 - 0s - loss: 0.0064 - mae: 0.0064 - val_loss: 0.0064 - val_mae: 0.0064 - 89ms/epoch - 1ms/step
Epoch 141/200
66/66 - 0s - loss: 0.0044 - mae: 0.0044 - val_loss: 0.0044 - val_mae: 0.0044 - 106ms/epoch - 2ms/step
Epoch 142/200
66/66 - 0s - loss: 0.0045 - mae: 0.0045 - val_loss: 0.0033 - val_mae: 0.0033 - 152ms/epoch - 2ms/step
Epoch 143/200
66/66 - 0s - loss: 0.0053 - mae: 0.0053 - val_loss: 0.0036 - val_mae: 0.0036 - 125ms/epoch - 2ms/step
Epoch 144/200
66/66 - 0s - loss: 0.0050 - mae: 0.0050 - val_loss: 0.0121 - val_mae: 0.0121 - 120ms/epoch - 2ms/step
Epoch 145/200
66/66 - 0s - loss: 0.0042 - mae: 0.0042 - val_loss: 0.0045 - val_mae: 0.0045 - 133ms/epoch - 2ms/step
Epoch 146/200
66/66 - 0s - loss: 0.0045 - mae: 0.0045 - val_loss: 0.0031 - val_mae: 0.0031 - 113ms/epoch - 2ms/step
Epoch 147/200
66/66 - 0s - loss: 0.0040 - mae: 0.0040 - val_loss: 0.0074 - val_mae: 0.0074 - 111ms/epoch - 2ms/step
Epoch 148/200
66/66 - 0s - loss: 0.0055 - mae: 0.0055 - val_loss: 0.0040 - val_mae: 0.0040 - 175ms/epoch - 3ms/step
Epoch 149/200
66/66 - 0s - loss: 0.0069 - mae: 0.0069 - val_loss: 0.0172 - val_mae: 0.0172 - 111ms/epoch - 2ms/step
Epoch 150/200
66/66 - 0s - loss: 0.0047 - mae: 0.0047 - val_loss: 0.0031 - val_mae: 0.0031 - 92ms/epoch - 1ms/step
Epoch 151/200
66/66 - 0s - loss: 0.0042 - mae: 0.0042 - val_loss: 0.0064 - val_mae: 0.0064 - 109ms/epoch - 2ms/step
Epoch 152/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0035 - val_mae: 0.0035 - 118ms/epoch - 2ms/step
Epoch 153/200
66/66 - 0s - loss: 0.0056 - mae: 0.0056 - val_loss: 0.0030 - val_mae: 0.0030 - 142ms/epoch - 2ms/step
Epoch 154/200
66/66 - 0s - loss: 0.0073 - mae: 0.0073 - val_loss: 0.0038 - val_mae: 0.0038 - 158ms/epoch - 2ms/step
Epoch 155/200
66/66 - 0s - loss: 0.0049 - mae: 0.0049 - val_loss: 0.0027 - val_mae: 0.0027 - 181ms/epoch - 3ms/step
Epoch 156/200
66/66 - 0s - loss: 0.0046 - mae: 0.0046 - val_loss: 0.0040 - val_mae: 0.0040 - 183ms/epoch - 3ms/step
Epoch 157/200
66/66 - 0s - loss: 0.0040 - mae: 0.0040 - val_loss: 0.0045 - val_mae: 0.0045 - 119ms/epoch - 2ms/step
Epoch 158/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0069 - val_mae: 0.0069 - 111ms/epoch - 2ms/step
Epoch 159/200
66/66 - 0s - loss: 0.0038 - mae: 0.0038 - val_loss: 0.0024 - val_mae: 0.0024 - 186ms/epoch - 3ms/step
Epoch 160/200
66/66 - 0s - loss: 0.0060 - mae: 0.0060 - val_loss: 0.0027 - val_mae: 0.0027 - 175ms/epoch - 3ms/step
Epoch 161/200
66/66 - 0s - loss: 0.0046 - mae: 0.0046 - val_loss: 0.0028 - val_mae: 0.0028 - 149ms/epoch - 2ms/step
Epoch 162/200
66/66 - 0s - loss: 0.0034 - mae: 0.0034 - val_loss: 0.0033 - val_mae: 0.0033 - 142ms/epoch - 2ms/step
Epoch 163/200
66/66 - 0s - loss: 0.0040 - mae: 0.0040 - val_loss: 0.0038 - val_mae: 0.0038 - 100ms/epoch - 2ms/step
Epoch 164/200
66/66 - 0s - loss: 0.0055 - mae: 0.0055 - val_loss: 0.0057 - val_mae: 0.0057 - 152ms/epoch - 2ms/step
Epoch 165/200
66/66 - 0s - loss: 0.0064 - mae: 0.0064 - val_loss: 0.0024 - val_mae: 0.0024 - 115ms/epoch - 2ms/step
Epoch 166/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0030 - val_mae: 0.0030 - 112ms/epoch - 2ms/step
Epoch 167/200
66/66 - 0s - loss: 0.0036 - mae: 0.0036 - val_loss: 0.0028 - val_mae: 0.0028 - 85ms/epoch - 1ms/step
Epoch 168/200
66/66 - 0s - loss: 0.0039 - mae: 0.0039 - val_loss: 0.0031 - val_mae: 0.0031 - 91ms/epoch - 1ms/step
Epoch 169/200
66/66 - 0s - loss: 0.0067 - mae: 0.0067 - val_loss: 0.0050 - val_mae: 0.0050 - 107ms/epoch - 2ms/step
Epoch 170/200
66/66 - 0s - loss: 0.0044 - mae: 0.0044 - val_loss: 0.0060 - val_mae: 0.0060 - 103ms/epoch - 2ms/step
Epoch 171/200
66/66 - 0s - loss: 0.0063 - mae: 0.0063 - val_loss: 0.0061 - val_mae: 0.0061 - 137ms/epoch - 2ms/step
Epoch 172/200
66/66 - 0s - loss: 0.0059 - mae: 0.0059 - val_loss: 0.0027 - val_mae: 0.0027 - 111ms/epoch - 2ms/step
Epoch 173/200
66/66 - 0s - loss: 0.0038 - mae: 0.0038 - val_loss: 0.0025 - val_mae: 0.0025 - 111ms/epoch - 2ms/step
Epoch 174/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0076 - val_mae: 0.0076 - 97ms/epoch - 1ms/step
Epoch 175/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0071 - val_mae: 0.0071 - 104ms/epoch - 2ms/step
Epoch 176/200
66/66 - 0s - loss: 0.0048 - mae: 0.0048 - val_loss: 0.0046 - val_mae: 0.0046 - 153ms/epoch - 2ms/step
Epoch 177/200
66/66 - 0s - loss: 0.0035 - mae: 0.0035 - val_loss: 0.0029 - val_mae: 0.0029 - 107ms/epoch - 2ms/step
Epoch 178/200
66/66 - 0s - loss: 0.0039 - mae: 0.0039 - val_loss: 0.0023 - val_mae: 0.0023 - 171ms/epoch - 3ms/step
Epoch 179/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0032 - val_mae: 0.0032 - 174ms/epoch - 3ms/step
Epoch 180/200
66/66 - 0s - loss: 0.0041 - mae: 0.0041 - val_loss: 0.0027 - val_mae: 0.0027 - 161ms/epoch - 2ms/step
Epoch 181/200
66/66 - 0s - loss: 0.0035 - mae: 0.0035 - val_loss: 0.0042 - val_mae: 0.0042 - 123ms/epoch - 2ms/step
Epoch 182/200
66/66 - 0s - loss: 0.0070 - mae: 0.0070 - val_loss: 0.0036 - val_mae: 0.0036 - 82ms/epoch - 1ms/step
Epoch 183/200
66/66 - 0s - loss: 0.0052 - mae: 0.0052 - val_loss: 0.0036 - val_mae: 0.0036 - 121ms/epoch - 2ms/step
Epoch 184/200
66/66 - 0s - loss: 0.0045 - mae: 0.0045 - val_loss: 0.0036 - val_mae: 0.0036 - 127ms/epoch - 2ms/step
Epoch 185/200
66/66 - 0s - loss: 0.0042 - mae: 0.0042 - val_loss: 0.0025 - val_mae: 0.0025 - 93ms/epoch - 1ms/step
Epoch 186/200
66/66 - 0s - loss: 0.0035 - mae: 0.0035 - val_loss: 0.0022 - val_mae: 0.0022 - 133ms/epoch - 2ms/step
Epoch 187/200
66/66 - 0s - loss: 0.0029 - mae: 0.0029 - val_loss: 0.0017 - val_mae: 0.0017 - 190ms/epoch - 3ms/step
Epoch 188/200
66/66 - 0s - loss: 0.0072 - mae: 0.0072 - val_loss: 0.0028 - val_mae: 0.0028 - 151ms/epoch - 2ms/step
Epoch 189/200
66/66 - 0s - loss: 0.0055 - mae: 0.0055 - val_loss: 0.0025 - val_mae: 0.0025 - 89ms/epoch - 1ms/step
Epoch 190/200
66/66 - 0s - loss: 0.0075 - mae: 0.0075 - val_loss: 0.0076 - val_mae: 0.0076 - 92ms/epoch - 1ms/step
Epoch 191/200
66/66 - 0s - loss: 0.0035 - mae: 0.0035 - val_loss: 0.0021 - val_mae: 0.0021 - 164ms/epoch - 2ms/step
Epoch 192/200
66/66 - 0s - loss: 0.0052 - mae: 0.0052 - val_loss: 0.0065 - val_mae: 0.0065 - 117ms/epoch - 2ms/step
Epoch 193/200
66/66 - 0s - loss: 0.0062 - mae: 0.0062 - val_loss: 0.0052 - val_mae: 0.0052 - 124ms/epoch - 2ms/step
Epoch 194/200
66/66 - 0s - loss: 0.0054 - mae: 0.0054 - val_loss: 0.0094 - val_mae: 0.0094 - 99ms/epoch - 1ms/step
Epoch 195/200
66/66 - 0s - loss: 0.0058 - mae: 0.0058 - val_loss: 0.0026 - val_mae: 0.0026 - 132ms/epoch - 2ms/step
Epoch 196/200
66/66 - 0s - loss: 0.0045 - mae: 0.0045 - val_loss: 0.0042 - val_mae: 0.0042 - 118ms/epoch - 2ms/step
Epoch 197/200
66/66 - 0s - loss: 0.0041 - mae: 0.0041 - val_loss: 0.0023 - val_mae: 0.0023 - 152ms/epoch - 2ms/step
Epoch 198/200
66/66 - 0s - loss: 0.0033 - mae: 0.0033 - val_loss: 0.0020 - val_mae: 0.0020 - 127ms/epoch - 2ms/step
Epoch 199/200
66/66 - 0s - loss: 0.0047 - mae: 0.0047 - val_loss: 0.0023 - val_mae: 0.0023 - 161ms/epoch - 2ms/step
Epoch 200/200
66/66 - 0s - loss: 0.0041 - mae: 0.0041 - val_loss: 0.0103 - val_mae: 0.0103 - 175ms/epoch - 3ms/step
17/17 [==============================] - 0s 1ms/step
1247.4253521543549

Part 4.5: Results Analysis

In [59]:
acc = history.history['mae']
val_acc = history.history['val_mae']
loss = history.history['loss']
val_loss = history.history['val_loss']
epoch = list(range(len(acc)))

fig_acc = go.Figure()

fig_acc.add_trace(go.Line(
    x = epoch,
    y = acc,
    legendgroup = 'Plot',
    legendgrouptitle_text = 'Plot',
    name = 'Accuracy',
    marker = dict(color='red'),
))

fig_acc.add_trace(go.Line(
    x = epoch,
    y = val_acc,
    legendgroup = 'Plot',
    legendgrouptitle_text = 'Plot',
    name = 'Validation Accuracy',
    marker = dict(color='orange'),
))

fig_acc.layout.update(title_text='Accuracy and Validation Accuracy Plot',xaxis_title='Epoch',yaxis_title='Accuracy')
fig_acc.show()

fig_loss = go.Figure()

fig_loss.add_trace(go.Line(
    x = epoch,
    y = loss,
    legendgroup = 'Plot',
    legendgrouptitle_text = 'Plot',
    name = 'Loss',
    marker = dict(color='red'),
))

fig_loss.add_trace(go.Line(
    x = epoch,
    y = val_loss,
    legendgroup = 'Plot',
    legendgrouptitle_text = 'Plot',
    name = 'Validation Loss',
    marker = dict(color='orange'),
))

fig_loss.layout.update(title_text='Loss and Validation Loss Plot',xaxis_title='Epoch',yaxis_title='Loss')
fig_loss.show()